Saturday Oct 06, 2007

The results for the Music Recommender Turing Test challenge are in! In this challenge, I posed the question: "Can you tell which music recommendation was generated by a human, and which was generated by a machine?" Two recommendation lists were generated in response to the question "If you like Miles Davis, you might like...?" One list was generated by a professional music critic, the other by an algorithm developed here in the labs. The lists were:

Recommendation List A: 

If you like "Miles Davis" you might like:

  1. John Coltrane
  2. Duke Ellington
  3. Thelonious Monk
  4. Charlie Parker
  5. Herbie Hancock
  6. Bill Evans
  7. Charles Mingus
  8. Sonny Rollins
  9. Wayne Shorter
  10. Weather Report

 Recommendation List B: 

If you like "Miles Davis" you might like:

  1. John Coltrane
  2. Weather Report
  3. Herbie Hancock
  4. Wayne Shorter
  5. Charlie Parker
  6. Dave Douglas
  7. Chet Baker
  8. Tony Williams Lifetime
  9. Can
  10. Sly & the Family Stone

The challenge received 29 responses, of which 19 contained direct predictions (the others opined about the quality of the recommendations or the difficulty of the challenge). The 19 predictions were quite skewed: 15 indicated that they thought list A was created by a human, while 4 thought that list B was created by a human. Some respondents were quite sure of their predictions:

"Well, this is too easy...
A = human
B = machine
no doubt in my mind."

Others were less sure:

"I bet B is machine generated, although it's a tough one to call."

One interesting thing is that although the clear majority thought that list A was created by a human, those who indicated a preference seemed to prefer list B. One commenter said, "B has differences that wouldn't show up in a human's list because a human will stick to the genre. That said I think I like B more."

The Results

To summarize: 15 respondents indicated that list A was created by a human, while 4 indicated that list B was. The actual sources of the lists are:

  • List A - a machine - a collaborative filtering system developed in Sun Labs based upon last.fm listener data
  • List B - a human - a professional music critic named Dominique.

Nearly 80% of the respondents (15 of 19) got the answer wrong.

BTW, Dominique, the music critic, responded to the comments: "yes, I am most certainly a robot! I like the guy who commented about miles' different eras -- one of the hardest parts about recommending based on him is trying to figure out what aspect/period of miles davis people are responding to."

This was a really fun exercise for me. The resulting dialog provided many insights into what it takes to make a good recommendation. And I find the results quite surprising: a rather traditional recommendation algorithm can mimic a human well enough to convince a significant majority that it is not a machine. Perhaps this bodes well for machine recommendation. (Of course, this experiment is too small and too casual to draw any long-term conclusions.)

And the Winner is ...

Of the 4 correct answers, I selected one at random to receive my extra copy of "Net, Blogs and Rock 'n' Roll". The winner could have been David Jennings (the author of the book), which would have been an interesting twist, but the coin wavered and sent the winning book to JuniorBonner. Congrats, Junior.


Friday Oct 05, 2007

Today is your last chance to get a free copy of Net, Blogs and Rock 'n' Roll. To qualify for the book, take the Music Recommendation Turing Test, and be sure to include your email address in the proper comment field so that I can contact you. Today at 2PM ET, I'll choose an entry at random from the set of correct answers to receive the book.

If you have already taken the Turing test, good for you - you are already entered. If you answer the Turing test more than once, I'll use your last answer as your definitive answer. And as MaryMary says: This is not a contest. I am giving away things that I own to people that I choose using stamps that I buy at the post office. Good Luck!

Thursday Oct 04, 2007

There are lots of music recommenders out there, each one vying for
our attention, trying to connect us to our next favorite band. But not all
recommenders are created equal. Some have the insights and good taste of a
Bill Goldsmith or a John Peel, while others seem to have as much insight
about what I might like as an iPod on shuffle play. But which
recommenders are the best, and which are the worst? And how do these music
recommendations compare to what a professional would generate? Can machines
compete with humans in what is clearly a question of taste? A few months
ago, I decided to try to answer these questions - to find out which music
recommender is the best, and to find out whether machines can compete with
humans when it comes to recommending music. Over the next few days, I'll
be blogging about what I've discovered.

The tough question

The hardest question to answer is 'How do you evaluate a recommendation?'.
A typical recommendation starts with a seed artist, so for instance
a recommendation may be of the form "If you like The Beatles you may
like..." followed by a list of recommended artists. Now, when I look at a
music recommendation based on an artist that I really know, I can get
a feel for the quality of the recommendation. In the list of recommended
artists, I expect to see some artists that I am already familiar with -
probably bands that I like. Even though these recommendations may
be obvious, they help me gain some trust in the recommender. If I
ask for a recommendation based on my affinity for Miles Davis and
a recommender suggests John Coltrane, I get a warm feeling that the
recommender is on the right track. It is a relevant (albeit somewhat
obvious) recommendation. I also expect to see some artists that I've
heard of but am not too familiar with, and I also expect to see some
artists that I've never heard of. So for me a good recommendation
contains a mix of the familiar (to help me gain trust), and the novel
(to help me find new music). Now of course, if the recommended artists
are off track ('if you like Miles Davis you might like Paris Hilton'),
no amount of familiarity or novelty is going to turn the recommendation
into a good recommendation. The recommendations need to be relevant.
To summarize, a good recommendation has three aspects:

  • familiarity - to help me gain trust in the recommender
  • novelty - without new music, the recommendation is pointless
  • relevance - the recommended music has to match my taste

Let's look at a couple of recommendations as examples. Here's a set of
similar-artist recommendations for The Beatles from last.fm. According to
last.fm, if you like the Beatles you might like:

  • The Rolling Stones
  • The Who
  • John Lennon
  • Led Zeppelin
  • Queen
  • Beach Boys
  • Doors
  • David Bowie
  • Kinks

Well, last.fm certainly gets the 'familiar' and the 'relevant', but
there's nothing novel here. Any fan of the Beatles has no doubt heard of
all of these bands; no great discoveries will be found in this selection.

Compare those to this set from Dominique, a professional music critic.
According to Dominique, if you like the Beatles you might like:

  • Paul McCartney/Wings
  • John Lennon
  • Harry Nilsson
  • Queen
  • George Harrison
  • ELO
  • Raspberries
  • Badfinger
  • XTC
  • The Millennium

This list has the familiar and relevant McCartney, Lennon, Harrison
and Queen, as well as some artists that a typical Beatles fan may not
have heard of: the Raspberries and The Millennium. Dominique seems to have
restricted his recommendations to bands from the late sixties and early
seventies. Not an unreasonable choice, but it does perhaps reduce the
novelty aspect of the recommendations.

Compare those to this set from Chris, another professional music critic.
According to Chris, if you like the Beatles you might like:

  • Chuck Berry
  • Harry Nilsson
  • XTC
  • Marshall Crenshaw
  • Super Furry Animals
  • Badfinger
  • The Raspberries
  • The Flaming Lips
  • Jason Faulkner
  • Michael Penn

So there's the familiar with Chuck Berry and Harry Nilsson, but I have to
pause and think for a bit about relevance - there's no Rolling Stones or
The Who to give me a warm fuzzy feeling. Chris gets to the novel artists
right away - high on the novelty scale but not high on the familiarity
scale (at least for me).

And finally, here's a set of similar artists from the All Music Guide.
According to All Music, if you like the Beatles you might like:

  • Hollies
  • Searchers
  • Peter and Gordon
  • Monkees
  • Gerry and the Pacemakers
  • Bee Gees
  • Zombies
  • Dave Clark Five
  • Remains
  • Sorrows

This list seems fairly balanced: there's the familiar (Monkees, Bee Gees,
Dave Clark Five, and the Hollies), they've avoided the cliches (no Rolling
Stones, Queen or The Who), and there are some bands that I'm not too
familiar with, including a band called 'The Sorrows', described as "one of
the most overlooked bands of the British Invasion" - which certainly sounds
like something that'd be fun to listen to if I like the Beatles.

With these four examples, I've tried to show the three elements
that I look for in a good music recommendation: familiarity, relevance
and novelty. The difficulty is, of course, that no two people will agree
exactly on what is familiar, relevant or novel. What I find novel, may be
familiar to a professional music critic, while a musician may find musical
relevance when I don't. The difficult job for a recommender, whether
it is human or a machine, is to find the proper level of familiarity,
novelty and relevance for each person.

Finally, if we really want to evaluate a number of recommender systems,
to compare the quality of recommendations, we have to figure out a good
way to turn the very subjective measures that we've looked at here into
a set of objective measures that we can use to score recommendations.
Part 2 will look at some of the objective ways we will use to evaluate the
various music recommenders.
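
To make this concrete, here's a minimal sketch of how the three aspects
might be reduced to crude numbers for a single listener. This is just my
illustration of the idea, not the evaluation method Part 2 will describe;
the function, the tag data and the taste profile are all made-up
assumptions:

    # Toy scorer for one listener. Tag overlap stands in (crudely)
    # for relevance; all names and data here are hypothetical.
    def score_recommendation(recommended, known_artists, artist_tags, taste_tags):
        familiar = [a for a in recommended if a in known_artists]
        novel = [a for a in recommended if a not in known_artists]
        # An artist counts as 'relevant' if it shares at least one
        # tag with the listener's taste profile.
        relevant = [a for a in recommended
                    if artist_tags.get(a, set()) & taste_tags]
        n = len(recommended) or 1
        return {'familiarity': len(familiar) / n,   # trust
                'novelty': len(novel) / n,          # discovery
                'relevance': len(relevant) / n}     # taste match

    # Example: a listener who already knows the obvious picks.
    print(score_recommendation(
        recommended=['John Lennon', 'Queen', 'The Millennium'],
        known_artists={'John Lennon', 'Queen'},
        artist_tags={'John Lennon': {'rock', '60s'},
                     'Queen': {'rock', '70s'},
                     'The Millennium': {'sunshine pop', '60s'}},
        taste_tags={'rock', '60s'}))
    # familiarity ~0.67, novelty ~0.33, relevance 1.0

The catch, as noted above, is that the known artists and taste tags are
different for every listener, so any such score is relative to a person.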

Last week I caught a glimpse of Meinard Müller's new book, Information Retrieval for Music and Motion. Meinard is a researcher and lecturer in the Multimedia Signal Processing Group at Bonn University. He has always been one of my favorite ISMIR speakers; his work on score alignment is particularly fascinating.

The publisher of this new book, Springer-Verlag, describes it thus:

This monograph details concepts and algorithms for robust and efficient information retrieval by means of two different types of multimedia data: waveform-based music data and human motion data. It first examines several approaches in music information retrieval, in particular general strategies as well as efficient algorithms. The book then introduces a general and unified framework for motion analysis, retrieval, and classification, highlighting the design of suitable features, the notion of similarity used to compare data streams, and data organization. The detailed chapters at the beginning of each part give consideration to the interdisciplinary character of this field, covering information science, digital signal processing, audio engineering, musicology, and computer graphics.


Wednesday Oct 03, 2007

Last.fm recently rolled out a new feature: Similar Tracks. Last.fm will tell you what fans of a particular track are also listening to. The similar tracks lists are created using the last.fm collaborative filtering algorithm (people who listened to X also listened to Y); a sketch of the basic idea appears at the end of this post. Last.fm seems to filter out all tracks by the same artist as the seed track, so you end up with an interesting set of tracks from different artists. Here are the similar tracks last.fm gives for Hey Jude:

  1. The Who – My Generation
  2. The Rolling Stones – Sympathy for the Devil
  3. The Who – Pinball Wizard
  4. The Rolling Stones – Paint It Black
  5. The Kinks – Lola
  6. The Kinks – You Really Got Me
  7. The Animals – House of the Rising Sun
  8. Creedence Clearwater Revival – Down on the Corner
  9. Steppenwolf – Magic Carpet Ride
  10. The Beach Boys – Barbara Ann
  11. Norman Greenbaum – Spirit in the Sky
  12. The Hollies – Butterfly
  13. The Hollies – Rain on the Window
  14. Manfred Mann – Hymn (From Jupiter)

And for Stairway to Heaven:

  1. Queen –  Bohemian Rhapsody
  2. Eagles – Hotel California
  3. Deep Purple – Smoke on the Water
  4. The Doors – Riders on the Storm
  5. Aerosmith – Dream On
  6. The Who – Pinball Wizard
  7. The Who – Won't Get Fooled Again
  8. Queen – We Will Rock You
  9. Dire Straits – Sultans of Swing
  10. The Rolling Stones – Sympathy for the Devil
  11. Deep Purple – Child in Time
  12. Lynyrd Skynyrd – Free Bird
  13. The Rolling Stones – Angie
  14. Aerosmith – Walk This Way

It doesn't look like there's any way to play  'similar tracks' radio right now, so you can use the similar tracks lists for exploring but not for playlisting. 

Update:  RJ  points out that the similar tracks data is also available via their webservices.  The similar tracks XML also includes a similarity score that indicates how similar the tracks are.
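
For the curious, here's a minimal sketch of the 'people who listened to X also listened to Y' idea behind these lists. It's a guess at the flavor of the approach, not last.fm's actual algorithm, and the tiny listening histories below are made up:

    from collections import Counter
    from itertools import combinations

    # Toy listening histories: user -> set of tracks they've played.
    histories = {
        'alice': {'Hey Jude', 'My Generation', 'Lola'},
        'bob':   {'Hey Jude', 'My Generation', 'Paint It Black'},
        'carol': {'Hey Jude', 'Lola', 'Spirit in the Sky'},
        'dave':  {'Lola', 'Spirit in the Sky'},
    }

    pair_counts = Counter()   # how often two tracks share a listener
    play_counts = Counter()   # how many listeners each track has
    for tracks in histories.values():
        play_counts.update(tracks)
        for a, b in combinations(sorted(tracks), 2):
            pair_counts[(a, b)] += 1

    def similar_tracks(seed):
        """Rank tracks by co-occurrence with the seed, normalized by
        popularity so ubiquitous tracks don't top every list."""
        scores = {}
        for (a, b), n in pair_counts.items():
            if seed in (a, b):
                other = b if a == seed else a
                scores[other] = n / play_counts[other]
        return sorted(scores.items(), key=lambda kv: -kv[1])

    print(similar_tracks('Hey Jude'))
    # [('My Generation', 1.0), ('Paint It Black', 1.0),
    #  ('Lola', 0.67), ('Spirit in the Sky', 0.5)]

A real system would also filter out tracks by the seed's own artist, as last.fm appears to do.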

Tuesday Oct 02, 2007


[Photo: The Phantom - originally uploaded by PaulLamere]
While taking a walk through Vienna, I noticed a camera crew and a car under wraps... it looks like the Rolls-Royce Phantom, a $400K car. Its 450 horses compare favorably to the two in the vehicle next to it.

A month ago, I pre-ordered a copy of David Jennings' book. Shortly afterward, David's publisher was kind enough to send me a copy. Well, the Amazon copy arrived yesterday, so I am now the proud owner of an extra copy, which I'm going to give away on this blog. To qualify for the book, take the Music Recommendation Turing Test, and be sure to include your email address in the proper comment field so that I can contact you.

On Friday, October 5, I'll choose an entry at random from the set of correct answers to receive the book.

If you have already taken the Turing test, good for you - you are already entered. If you answer the Turing test more than once, I'll use your last answer as your definitive answer. And as MaryMary says: This is not a contest. I am giving away things that I own to people that I choose using stamps that I buy at the post office. Good Luck!

Monday Oct 01, 2007

Oscar, my co-tutor for the music recommendation tutorial, has posted the Music Recommendation Tutorial Website with supporting information for the tutorial. On the site you will find PDFs of the slide deck (all 250 slides), the tutorial proposal that we submitted to the program committee back in April (I knew I was signing up for a lot of work, but I didn't know exactly how much) and some other bits and pieces.

Oscar and I would love to hear some feedback on the tutorial - the good and the bad bits. (Yes, we had too much material to cover in too little time; apologies for talking so fast.) We'd also like to continue the conversations about what makes a good recommendation - in particular the 'relevance + novelty' idea - and how it can be measured. So if you have thoughts about this, or comments about the tutorial, please add them to the comments of this post.

Finally, let me say it was great fun working with Oscar - he's smart, easygoing and just filled with good ideas. But Oscar learned an important lesson: don't overstuff yourself on Wiener schnitzel the night before a big talk ... you'll not be too happy the next day. Here's Oscar, an hour before the tutorial, wondering if he'd make it through ...


The big surprise at the ISMIR (International Conference on Music Information Retrieval) dinner was the announcement of the locations of the next three conferences. Everyone already knew that ISMIR 2008 will be held in Philadelphia. But we didn't know that ISMIR 2009 is to be in Kobe, Japan, and ISMIR 2010 will be in Utrecht. Fantastic!

Saturday Sep 29, 2007


[Photo: Vienna Cafe - originally uploaded by PaulLamere]
I discovered the Vienna cafes this week. Unlike a Starbucks, these cafes are places where you can linger. You can talk, work, read or just hang out. It really feels like a different century in some of these; I was expecting to see Sigmund Freud walk in at any moment. It is all about pace - and the Vienna cafe is nice and slow.

Friday Sep 28, 2007

I'm staying in a hotel that charges 6 Euros for 3 hours of internet time. Luckily, Google Reader added an offline mode, so I can suck down 2000 blog posts in a minute and leisurely read them while offline. Google is saving me money. Very nice!

Next Saturday I'll be travelling to Montreal during the Pop Montreal festival. I'm sitting on a panel entitled Recommendation Engine: Conceptualizing the Universe, along with Sandy Pearlman (record producer), Doug Eck (University of Montreal, music + machine learning expert) and Brian Whitman (The Echo Nest).

The description of the panel is:

How a full bore recommendation engine would reliably lead anyone in the world to all the music in the world they would love if they only knew it existed. With an aggregate of over one billion songs – as opposed to titles now available somewhere on line (according to Cache Logic), implementing recommendation engineering navigation, capable of creating a taste based “trend line” populated by as many songs, version and/or title that one may specify, simply as a function of a few user inputted descriptors, would impose individually generated guided availability upon what would otherwise prove an ungraspable chaos of music overload. How will recommendation engines rise to the challenge? Some of the world’s leading technological gurus, futurists, licensing library reps and recommendation engine leaders discuss one emerging trend in the future of music

Now, I don't know what any of that means, but I still think the panel will be a good look at what is going on in the world of music recommendation, so stop on by if you are in the area. More info on the Pop Montreal page.

Thursday Sep 27, 2007


[Photo: CIMG5299 - originally uploaded by Peter Chow]
Last night Elias and Gwen brought a gang of us to the Vienna Opera. For only 2 Euros, you can get a space in the gallery to stand and watch the opera. The gallery is the standing area behind the last seats on the topmost balcony. It is an incredible value, since main house seats can cost 150 or 200 Euros. We watched about the first 45 minutes of the Barber of Seville. I really enjoyed the show - the singing and orchestra were amazing, the acoustics were excellent (no amplification), and the audience was well behaved. We left before the fat lady sang because we really only wanted a taste of the opera, and after standing for much of the day we were tired and wanted to sit. Elias and Gwen then brought us to a traditional underground Vienna restaurant, and then to a quite cool Vienna cafe. It was a great night of great conversations. Many thanks to Elias and Gwen for organizing the night out.
Yahoo Research's Malcolm Slaney presented a paper at ISMIR on Wednesday about how they are using user rating data to create song similarity data. Yahoo is in the enviable position of having billions of user-taste data points about music. This data, naturally, can be used to generate item-to-item similarities that would be extremely useful as ground truth for any number of MIR tasks. Malcolm's motivation for the talk was to propose an alternative to the rather time-consuming and painful process of human evaluations used in the music similarity task in MIREX. Malcolm presented a rather traditional item-to-item collaborative filtering system - nothing new in the approach. I was hoping that at the end Malcolm would say that they are giving a big wad of the taste data or the item-item similarity data to the MIR community, but alas, Malcolm says that it is just too hard to give away such data - especially after the AOL shared-data fiasco of last year.
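
To illustrate what an item-to-item system built on ratings computes (my sketch of the general technique, not Malcolm's actual system), the similarity of two songs can be taken as the cosine between their columns of the user-rating matrix:

    import math

    # Toy user -> song ratings (1-5 stars). Illustrative only; Yahoo's
    # system works from billions of real ratings.
    ratings = {
        'u1': {'song_a': 5, 'song_b': 4},
        'u2': {'song_a': 4, 'song_b': 5, 'song_c': 1},
        'u3': {'song_b': 2, 'song_c': 5},
    }

    def item_vector(song):
        """The song's ratings, keyed by the users who rated it."""
        return {u: r[song] for u, r in ratings.items() if song in r}

    def cosine_similarity(song_x, song_y):
        """Cosine between the two songs' user-rating vectors."""
        vx, vy = item_vector(song_x), item_vector(song_y)
        dot = sum(vx[u] * vy[u] for u in set(vx) & set(vy))
        norm_x = math.sqrt(sum(v * v for v in vx.values()))
        norm_y = math.sqrt(sum(v * v for v in vy.values()))
        return dot / (norm_x * norm_y) if norm_x and norm_y else 0.0

    print(cosine_similarity('song_a', 'song_b'))  # ~0.93: rated alike
    print(cosine_similarity('song_a', 'song_c'))  # ~0.12: different fans

Ranked over an entire catalog, the nearest neighbors of each song form exactly the kind of similarity lists that could serve as MIREX ground truth.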

The Listen Game - a game used to collect semantic annotations of music - has a nice write-up in Dr. Dobb's. The money quote:

"When my mom gets up in the morning and is like, 'I need some energy to go jogging,' she has no clue what title or artist is going to help her with that," said Gert Lanckriet, the electrical engineering professor overseeing the project.

What Lanckriet's mom needs is a "Google for music" - a search engine for music that lets you type in regular words like "high energy instrumental with piano," "funky guitar solos" or "upbeat music with female vocals," and get songs in return.
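
Here's a minimal sketch of how such a text-to-music search could work once games like the Listen Game have annotated songs with descriptive tags. All song names and tags below are made up for illustration:

    from collections import defaultdict

    # Hypothetical semantic annotations of the kind the Listen Game collects.
    annotations = {
        'Song One':   {'high energy', 'instrumental', 'piano'},
        'Song Two':   {'funky', 'guitar solos'},
        'Song Three': {'upbeat', 'female vocals'},
    }

    # Inverted index: tag -> songs carrying that tag.
    index = defaultdict(set)
    for song, tags in annotations.items():
        for tag in tags:
            index[tag].add(song)

    def search(query_tags):
        """Return songs ranked by how many query tags they match."""
        hits = defaultdict(int)
        for tag in query_tags:
            for song in index.get(tag, ()):
                hits[song] += 1
        return sorted(hits, key=hits.get, reverse=True)

    print(search({'high energy', 'instrumental', 'piano'}))  # ['Song One']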

