Saturday Sep 15, 2007

ISMIR, the annual Music Information Retrieval conference, is just a week away.  I'm really looking forward to it.  It is always a great learning experience and a super opportunity to meet lots of energetic, very smart people who are passionate about music.  This year I'll be busy too: I'm presenting a music recommendation tutorial with Oscar, I have two poster sessions (Thierry is doing all the hard work on one of them), and I am chairing the session on recommendation.  This will be only my second time acting as a session chair.  At last year's ISMIR I learned what my duties as a session chair are.  There are really just two: make sure that the speakers end on time, and make sure that during the Q&A after a talk, if no one else asks a question, you ask one. Apparently, every speaker needs at least one question to feel fulfilled as a speaker.  Easy enough: end on time and ask a question if needed.

It was in my role as session chair that I had my worst ISMIR moment.  I was doing fine making sure that the speakers ended on time (even when we had to swap speakers around because one chap couldn't get his slides to appear on the projector).  However, there was one speaker who gave a talk about a topic that I just didn't understand. I didn't grasp the goal of the research, the methods, the conclusions or the applicability of the work.  All the way through the talk I was racking my brains trying to eke out an appropriate, salient question about the research, a question that wouldn't mark me as the idiot that I clearly was. By the end of the talk I was regretting my decision to accept the position of session chair.  I could only pray that someone else would ask the required question and save me from humiliating myself and insulting the speaker.  The speaker concluded the talk.  I stood up, thanked the speaker, offered a silent prayer to the God of Curiosity, and then asked the assembled audience for questions. Silence.  Long silence. Really long silence. My worst nightmare. I was going to ask a question, but by this point I couldn't even remember what the talk was about. It was going to be a bad question, something like "Why do you find this topic interesting?" or "Isn't Victoria nice?".  Just microseconds before I uttered my feeble query, a hand went up.  I was saved. Someone asked a question.  I don't remember the question, I just remember the relief. My job as session chair was complete; every speaker had their question.

This year, I think I'll be a bit more comfortable as a session chair.  I know the topic of the session (recommendation) quite well, and I know the speakers too, but still, please don't be offended if I ask you "Why do you find this topic interesting?"



 

Well, it looks like playlists were patented back in the late 90s, and the company that holds the patent has decided to sue - I guess we are all going to have to start playing our songs one at a time.  Oh no, wait, perhaps I can patent the Playqueue, or the Playstack, or even the Playheap.  Via BjornW.

Friday Sep 14, 2007

Here are two sets of music recommendations: one was created by a machine (a music recommendation algorithm that we've created here in the labs), and the other by a professional music critic.  Can you tell which is which? Which one is the better list? Post your answers in the comments.

Recommendation List A: 

If you like "Miles Davis" you might like:

  1. John Coltrane
  2. Duke Ellington
  3. Thelonious Monk
  4. Charlie Parker
  5. Herbie Hancock
  6. Bill Evans
  7. Charles Mingus
  8. Sonny Rollins
  9. Wayne Shorter
  10. Weather Report

 Recommendation List B: 

If you like "Miles Davis" you might like:

  1. John Coltrane
  2. Weather Report
  3. Herbie Hancock
  4. Wayne Shorter
  5. Charlie Parker
  6. Dave Douglas
  7. Chet Baker
  8. Tony Williams Lifetime
  9. Can
  10. Sly & the Family Stone 

Thursday Sep 13, 2007

The Amazon music store is soon to be unveiled - and soon afterwards we will all know whether it is going to be a serious competitor to iTunes.  Here's what Amazon needs to do  to compete:

  • Work seamlessly with the iPod - If I (or, more importantly, a non-digital-music guru) can't get music from the Amazon store onto an iPod nano or shuffle, then the store is a non-starter.
  • Broad and Deep Catalog - The Amazon bookstore is a poster child for the long tail - they had better do the same for music.
  • DRM-Free - Rumor is that Amazon is doing the right thing here ... no DRM on their tracks.
  • Discovery and Recommendation - Amazon is well known for its recommendation and personalization technologies.  If they can make it as easy to find new music as it is to find new books, then Jeff may give Steve a run for his money.
That's what they need to do. Here's what I hope they will do too.

 

I'm excited.  I just got my physical copy of David Jennings' new book 'Net, Blogs and Rock 'n' Roll'.  The publishers did a nice job - it is a very pretty book (with a nice index too).  I'm looking forward to re-reading it in the next week.  As I said to David (in a comment that found its way onto the cover of the book as a blurb), this book is a super read that should be on the shelf of everyone who cares about how people find new music and media that matches their tastes.  This really important book has changed the way I think about music listeners.

I've had an excellent response so far to the music recommendation survey that I'm conducting.  I've received nearly 200 responses and have collected nearly 10,000 data points.  This data will be extremely useful for evaluating music recommenders.  There's all sorts of interesting data in the survey. For instance, people think that if you like The Beatles, you'll probably prefer Mozart over Aerosmith. That's probably because Aerosmith were the 'Future Villain Band' in that great film of 1978 - "Sgt. Pepper's Lonely Hearts Club Band" - where they played their evil cover of 'Come Together'.

 If you haven't already, please take:  The Music Recommendation Survey

 

Wednesday Sep 12, 2007

It is time to start thinking about JavaOne 2008. The JavaOne call for papers is coming soon, so if you are doing something new and/or cool with Java technology, it is time to start thinking about that JavaOne talk proposal. Giving a JavaOne talk can be quite a bit of work (and a bit of stress too), but it is also a lot of fun - and an excellent way to get exposure for your work.  Here's the talk I gave at JavaOne 2007:

Search Inside the Music: Using Signal Processing, Machine Learning, and 3-D Visualizations to Discover New Music

Tuesday Sep 11, 2007

Over at last.fm, they are making a big push to clean up the world of music metadata.  A substantial fraction of the ID3 tags embedded in MP3 files are missing, inconsistent, or flat-out wrong.  last.fm has released a fingerprinter that they can use to resolve any track to a unique ID based upon the audio content of the track.   RJ, one of the founders of last.fm, indicates that in the end they may actually be able to find out how many ways there are to spell "Guns N’ Roses – Knockin’ on Heaven’s Door".  Just to give you an idea of how much fun this can be, here are the top 12 ways to misspell 'Guns N' Roses', according to MusicBrainz:

Update: Check out the comments section - RJ has posted over 400 variants of GNR collected by the last.fm fingerprinter so far.


Guns N Roses
Guns and Roses
guns 'n' roses
Guns 'N Roses
Guns & Roses
Guns'N'Roses
Guns N'Roses
Guns'N Roses
Guns´n Roses
Guns N´ Roses
Guns -N- Roses
GNR
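
The fingerprinter attacks this problem at the audio level, which is the right way to do it. Just to illustrate why plain string matching doesn't get you very far, here is a naive, purely hypothetical normalization sketch (my own throwaway code, not anything last.fm or MusicBrainz actually does) that collapses some of the simpler variants above:

```java
import java.text.Normalizer;
import java.util.*;

// A naive, illustrative artist-name normalizer. It catches simple punctuation
// and "N"/"and"/"&" variants, but it will never catch "GNR", typos, or the
// Tchaikovsky mess below -- which is why audio fingerprints are needed.
public class ArtistNameNormalizer {

    public static String normalize(String name) {
        String s = Normalizer.normalize(name, Normalizer.Form.NFD)
                .replaceAll("\\p{M}", "");          // strip combining accents
        s = s.toLowerCase(Locale.ROOT);
        s = s.replaceAll("['’`´-]", " ");           // drop apostrophes and dashes
        s = s.replace("&", " and ");                // '&' -> 'and'
        s = s.replaceAll("\\bn\\b", "and");         // lone 'n' -> 'and'
        return s.replaceAll("\\s+", " ").trim();
    }

    public static void main(String[] args) {
        String[] variants = {
            "Guns N Roses", "Guns and Roses", "guns 'n' roses", "Guns 'N Roses",
            "Guns & Roses", "Guns'N'Roses", "Guns N'Roses", "Guns -N- Roses", "GNR"
        };
        // Group the spellings by their normalized form
        Map<String, List<String>> groups = new TreeMap<>();
        for (String v : variants) {
            groups.computeIfAbsent(normalize(v), k -> new ArrayList<>()).add(v);
        }
        groups.forEach((k, v) -> System.out.println(k + "  <-  " + v));
    }
}
```

Everything except 'GNR' collapses to a single key with this sort of hack, but one look at the classical-composer aliases below shows why it doesn't scale.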


Guns N' Roses is pretty easy compared to Tchaikovsky.  MusicBrainz shows 86 aliases for the Russian composer:

Tchaikovsky
Peter Ilyich Tchaikovsky
Peter Tchaikovsky
Tschaikowsky
Peter Tschaikowsky
Piotr Ilyich Tchaikovsky
Piotr Tchaikovsky
Peter Iljitsch Tschaikowsky
Tchaikovsky, Peter Ilyich
Peter Ilyitch Tchaikovsky
Pjotr Ilyich Tchaikovsky
Peter I. Tschaikowsky
Pyotr Tchaikovsky
P. I. Tchaikovsky
Peter Ilich Tchaikovsky
Tsjaikovski
Tchaikovsky, P.I.
Tchaikovsky, Pyotr Ilyich
Piotr Ilyitch Tchaikovsky
Tjajkovskij
Tsjajkovskij
Peter Ilyich Tchaikovski
Peter I. Tchaikovsky
Tchaikowsky
Tchaikovsky, Peter I.
Peter Ilyich Tschaikowsky
Pyotr Il'Yich Tchaikovsky
Peter Iljitsch Tschaikowski
Tchiakovsky, Pyotr Ilich (1840-1893)
Tchaikovsky, Peter Ilyich (1840-93)
Tchaikovsky, Peter Ilyitch
Tsaikovski
Pyotr Ilyitch Tchaikovsky
Pytor Ilyich Tchaikovsky
Pyotr Ilyich Tchaikovsky
Piotr Ilitch Tchaïkovski
Chaikovsky, P. I.
Pyotr Ilyich Tchaikovsly
Peter Ilyich Tchaikovshy
Tchaikovsky, Piotr Ilich (1840-1893)
Pjotr Iljitsch Tschaikowsky
Ciaikosvsky
Tchaikovsky 1841-1893
Pyotr Illyich Tchaikovsky
Tchaïkovki
Piotr Ilych Chaikovsky
Piotr Ilic Ciaikovsky
Pjotr Iljitsj Tsjaikovski
Pyotor Ilyich Tschaikovsky
Peter Iljitsj Tsjaikovski
P. I. Tchaikovskij
Piotr Ilich Tchaikovsky
Peter Iljitsch Tchaikovsky
Tchaikovisky
Tchaikovsy
Tchailovisky
Tchaikovskyes
Tchaikovskys
Tchaikoskvy
Piotr Il'yich Tchaikovsky
Tchaikowski
Piotr Illitch Tchaïkovsky
Tchaikovsky, Pjotr Ilyich (1840 - 1893)
Tchaikovsky, Peter Il'yich
Piotr Iljič Čajkovskij
Петр Ильич Чайковский
Tchaikivsky
Tchaikovsky, PI
Чайковский, Пётр Ильич
Pyotor Tchaikovsky
Чайковский, Петр Ильич
Tchaikovsky - Philharmonic Orchestra
Pytor Tchaikovsky
Pyotr II'yich Tchaikovsky
Piotr Ilyich Tchaikowsky
Peter Ilych Tschaikowsky
Peter Llyich Tchaikovsky
Peter Tschaikovsky
Peter Illyich Tchaiskovsky
Peter Iljitsch Tschaikowsky (1840 - 1893)
Tchaikovsky, Pyotr
Tsjaikovsky
Èajkovskij
Peter Tjajkovskij
P. Czajkowski
P. Tchaikovsky
Tchaikovsky Petr
Pyotor Ilitsch Tchaikovsky
Pytor Il'Yich Tchaikovsky
Peter Ilych Tchaikovsk




Surprisingly, MusicBrainz lists only 4 spellings for Britney Spears, while Google suggests that there are hundreds and hundreds of alternatives.

Monday Sep 10, 2007

The survey data is coming in.  So far I have over 80 responses with over 4,000 datapoints to use to evaluate recommendations.  This is very good data. Thanks to all who have submitted responses. Keep the data coming:  Take the survey

Sunday Sep 09, 2007

In the comments section, Ian raises some questions about the relevance of artist-level recommendations and why I'm interested in this type of evaluation.  Instead of burying my answers in the comments, I'll respond right here:

I think there is room in the world for both artist-level recommendations and track-level recommendations.  Clearly, for sites like Pandora that are trying to give their users a traditional radio-like listening experience, quite a bit of attention should be paid to making good playlists. This involves understanding music similarity at the track level; otherwise the listeners will get iPod whiplash from the 'Eleanor Rigby' to 'Helter Skelter' song transitions. There are many situations where track-level recommendations are going to be best.

But I don't discount artist-level recommendations.  These are important for someone who is looking for new music.  Every month I head over to eMusic to try out a brand new artist - but I don't pick them randomly, nor do I pick the artist at the top of the charts. Instead, I'll use an artist-level recommender to find an artist that I might like based on artists I already like.

For this study, since my goal was to compare (at least casually) various recommenders, I really had to use a lowest-common-denominator approach.  Most commercial recommenders will give you artist-level recommendations, or 'similar-artist' lists.  Only a very few give track-level recommendations.  I decided to start with the simplest of recommendation tasks: artist-level recommendations based upon a single artist seed.  There are a dozen recommenders that support this type of recommendation, so it seemed to be the best task for a first comparison.  This does, however, leave out a number of recommenders that work at the track level (Pandora and Zukool, for instance).  This doesn't mean that I think track-level recommendations are not important.
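
In code terms, the common task that every one of these recommenders supports boils down to something like the following hypothetical interface (the name and shape are mine, just to pin down the task; each real system exposes it differently - as a web page, a web-service call, and so on):

```java
import java.util.List;

// A hypothetical interface describing the lowest-common-denominator task:
// given a single seed artist, return an ordered list of similar artists.
public interface ArtistRecommender {
    /** Return up to n artists similar to the seed artist, best matches first. */
    List<String> recommend(String seedArtist, int n);
}
```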

I am really interested in exploring the question "What makes a good recommendation?".  When I look at the various recommendations offered for the seed artist 'The Beatles', I am really forced to think hard about that question.  Is 'The Rolling Stones' a good answer to "If you like The Beatles you might like ..."? Surely anyone who has heard of the Beatles must also know about the Rolling Stones and has already made up their mind about them, one way or another, so it can't be a good recommendation.   On the other hand, seeing "The Rolling Stones" on a list of Beatles-like recommendations gives me some confidence that the recommender is giving reasonable recommendations.  Questions of popularity, serendipity, familiarity, trust, and fashion all factor into people's sense of what makes a good recommendation.  I'd like to understand this at a deeper level.

With that, here's what I am doing in this study:

  1. Collecting artist-level recommendations from a number of commercial and academic machine-based recommender systems for 5 seed artists
  2. Collecting artist-level recommendations from a number of professional music critics, for the same 5 seed artists
  3. Comparing the machine-based systems to see how well they agree with each other
  4. Comparing the humans to see how well they agree with each other
  5. Comparing the machine-based recommendations to the human-created recommendations (a rough sketch of this kind of list comparison follows this list)
  6. Collecting data via a survey that rates the quality of each recommended artist
  7. Scoring the various recommenders based upon the quality of recommendations as rated in the survey
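
To make the list-comparison steps concrete, here is a minimal sketch of the kind of agreement measure I have in mind: the overlap between two top-N similar-artist lists. The exact measure I end up reporting may well differ (rank-weighted measures are another obvious option); the example data is just the pair of 'If you like Miles Davis' lists shown earlier on this blog.

```java
import java.util.*;

// A minimal sketch of one way to compare two top-N recommendation lists:
// the fraction of artists they have in common (set overlap). Rank-weighted
// measures are another option; this is just the simplest baseline.
public class ListAgreement {

    // Overlap coefficient: |A ∩ B| / min(|A|, |B|)
    public static double overlap(List<String> a, List<String> b) {
        Set<String> common = new HashSet<>(a);
        common.retainAll(new HashSet<>(b));
        return (double) common.size() / Math.min(a.size(), b.size());
    }

    public static void main(String[] args) {
        // The two example "If you like Miles Davis" similar-artist lists
        List<String> listA = Arrays.asList("John Coltrane", "Duke Ellington",
            "Thelonious Monk", "Charlie Parker", "Herbie Hancock", "Bill Evans",
            "Charles Mingus", "Sonny Rollins", "Wayne Shorter", "Weather Report");
        List<String> listB = Arrays.asList("John Coltrane", "Weather Report",
            "Herbie Hancock", "Wayne Shorter", "Charlie Parker", "Dave Douglas",
            "Chet Baker", "Tony Williams Lifetime", "Can", "Sly & the Family Stone");

        System.out.printf("Agreement: %.2f%n", overlap(listA, listB)); // prints 0.50
    }
}
```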

There are all sorts of things that I am not measuring, mainly because it is hard to figure out how to measure them. Of particular importance are:

  1. Serendipity - it is unlikely that a survey taker, when encountering a recommended artist that they don't recognize in the survey, will actually go and listen to the artist and then rate the quality of the recommendation.  No one is going to take the time to do that. But this is what we really want to measure - did we recommend something that was novel and interesting to you?
  2. Trust - to me, a good recommendation list will contain some artists that I already know, and some that I don't. For this survey, you are not seeing the individual artist lists, so you can't make an assessment of how reliable a recommendation list is.

The bottom line is that many, many factors make a good recommendation, and they are not always easy to measure, but there are some things that we can do to start to understand which factors are most important.

I hope I have answered at least some of your questions - and I'd appreciate input from the many smart minds that read this blog on how to do this better.



Saturday Sep 08, 2007

Well, I learned my lesson ... don't use 'mailto' to collect survey results.  It is just not a reliable way to do it.  I've recoded the survey to use a real servlet that logs the results where I can collect them at my leisure.  This should work better now.  So, I encourage you to take the survey.
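
For the curious, the servlet is nothing fancy. Here's a rough sketch of the sort of thing it does - the parameter handling, field names, and log-file location are made up for illustration, not the actual code:

```java
import java.io.*;
import java.util.*;
import javax.servlet.http.*;

// A bare-bones sketch of a survey-logging servlet: append the posted form
// parameters to a file for later analysis. Details (log location, response
// page, thread safety) are simplified for illustration.
public class SurveyLogServlet extends HttpServlet {

    private static final File LOG = new File("/tmp/survey-results.log");

    @Override
    protected synchronized void doPost(HttpServletRequest req, HttpServletResponse resp)
            throws IOException {
        try (PrintWriter out = new PrintWriter(new FileWriter(LOG, true))) {
            out.print(new Date() + "\t" + req.getRemoteAddr());
            for (Map.Entry<String, String[]> e : req.getParameterMap().entrySet()) {
                out.print("\t" + e.getKey() + "=" + String.join(",", e.getValue()));
            }
            out.println();
        }
        resp.setContentType("text/html");
        resp.getWriter().println("<html><body>Thanks for taking the survey!</body></html>");
    }
}
```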

 Also, based on comments received, I'm including all of the seed artists in the survey instead of starting with just  'The Beatles'.
 

Friday Sep 07, 2007

In a few weeks, I'll be presenting (along with Oscar Celma) a music recommendation tutorial at ISMIR 2007.  As part of this tutorial, Oscar and I will be presenting various evaluations of commercial and academic music recommenders.  There are lots of ways to evaluate recommender systems, but when you get right down to it, probably the best way is to ask people what they think about them.   And so ... we are conducting a survey, and we hope that you will participate.  The survey is very simple: you are asked to rate the quality of recommendations of the form "If you like The Beatles you might like XXX".  The survey will take less than 10 minutes to complete. If you are interested, then

Take the survey

The results of the survey will be used to rank various recommendations generated by commercial and academic recommender systems, as well as recommendations generated by professional music critics.  Surveys for recommendations based on other seed artists will be coming soon.

We are hoping to get hundreds of replies, so feel free to tell others about the survey. I shall be posting results after the conference at the end of the month.

Update: Well, it turns out that using 'mailto:' on a form post is notoriously unreliable, so I spent a few hours this morning fixing it. Things should now work quite a bit better.

Wednesday Sep 05, 2007

I've had my iPod for 2.5 years. It's a 4th generation, 40GB iPod.  It is just about filled with music and podcasts.  I use it all the time, probably more than any other device that I own (well, except for my laptop).  I use it more than my car, my TV and my phone.  Even after 2.5 years, the battery works just fine; I get about 8 hours on a charge, which gets me through the day.

Last night, while taking my evening walk, the 'skip ahead' button stopped working.  I could skip to the next track, but I could no longer fast-forward through a track.  Well, I'll just have to replace it, I thought ... which, considering that new iPods were to be announced in 12 minutes, seemed like a pretty good thing. But, alas, after I rebooted my iPod, the 'skip ahead' button returned with full functionality.  And so, I have no excuse to upgrade today. Sigh.

 

Last night everyone was out of the house except for Jennie and me. Jennie is 12 years old, and is probably the most avid music listener of my 4 children.  We have a computer connected to the stereo, so I decided I would show her how last.fm worked.  We fired up 'high school musical' tag radio (if you have a pre-teen girl in your household, you probably know all about High School Musical), pointed our web browser at Snapp Radio, and had our own little High School Musical party.  Snapp Radio did a fine job of selecting good photos (there are lots of High School Musical concert photos on Flickr). However, Snapp Radio did get a bit torn between showing photos of Vanessa Hudgens (aka Gabriella) and Rodrigo y Gabriela, the acoustic-metal duo.  All in all, it was a fun time; however, I now have a whole lot of tracks in my last.fm music listening history that I'd rather not have - Britney Spears, Ryan and Sharpay, Troy, 'Get'cha Head in the Game'. Hopefully this won't change what last.fm thinks of me too much.  (Despite its indie leanings, I am not the lone High School Musical listener at last.fm.  This week, 4 of the top 5 on the Artist Hype List at last.fm are High School Musical related.)

 

Thursday Aug 30, 2007

Project Darkstar has just issued its first open-source release. Project Darkstar is the game industry’s first open-source, enterprise-grade, highly scalable online game server.
