With the looming deadline for JavaOne talk submissions, I finally got off my duff, stopped procrastinating and submitted my JavaOne talk proposal.  Here's the proposal:

Search Inside the Music

Using signal processing, machine learning, and 3D visualizations to discover new music

In this talk, we present Search Inside the Music, a
research project in Sun Labs that is exploring new ways
to help people discover new music even as our music
collections grow very large.

As online music collections grow to many millions of
songs, finding a new song that we might like is becoming
very difficult.  The Search Inside the Music system can
help us find new music by finding music that *sounds like*
music that we already know and like.

SITM, written entirely in the Java programming language,
uses digital signal processing and machine learning
algorithms to build a music similarity model that
can predict how similar or dissimilar a pair of songs
sound. SITM uses this music similarity model to recommend
music by finding music that sounds similar to music that
you already know and like.
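
To make the idea concrete, here's a minimal sketch of that kind of
similarity computation, assuming each song has already been boiled
down to a fixed-length feature vector.  The real SITM model is
learned from extracted audio features; the class below and its plain
Euclidean distance are just illustrative stand-ins.

    // Hypothetical sketch: Euclidean distance between song feature
    // vectors as a stand-in for the learned similarity model.
    public class SongSimilarity {

        /** Distance between two feature vectors; smaller means "sounds more alike". */
        public static double distance(double[] a, double[] b) {
            double sum = 0.0;
            for (int i = 0; i < a.length; i++) {
                double d = a[i] - b[i];
                sum += d * d;
            }
            return Math.sqrt(sum);
        }

        /** Return the index of the song (other than the seed) that sounds most like it. */
        public static int mostSimilar(double[][] features, int seed) {
            int best = -1;
            double bestDist = Double.MAX_VALUE;
            for (int i = 0; i < features.length; i++) {
                if (i == seed) continue;
                double d = distance(features[seed], features[i]);
                if (d < bestDist) {
                    bestDist = d;
                    best = i;
                }
            }
            return best;
        }
    }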

Not only can we use this music similarity model to help
recommend music, we can also use it to generate a more
engaging, immersive interface to our music.  SITM uses
the music similarity model to generate a 'music space':
a three-dimensional representation of a music collection
in which songs are positioned based upon musical
similarity.  In this music space, Classical music may be
clustered in one corner, trying to stay as far away from
Punk music as possible, while Blues finds a home near, but
separate from, Jazz and Rock. This visualization encourages
music exploration.  New songs can be auditioned by clicking
on a song in the visualization. Similar sounding songs can
be found by clicking on a song's neighbors.  Interesting
playlists can be generated by creating paths through this
music space.
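
As a rough illustration of the path idea (not the actual SITM
algorithm), here's a hypothetical greedy walk that steps from a start
song toward a destination song through the 3D positions, at each step
picking the unvisited song with the smallest detour:

    // Hypothetical sketch of building a playlist as a path through the
    // music space: greedily step from a start song toward a goal song,
    // choosing the unvisited song that minimizes (distance from the
    // current song) + (distance to the goal).
    import java.util.ArrayList;
    import java.util.List;

    public class PlaylistPath {

        /** positions[i] is the (x, y, z) location of song i in the music space. */
        public static List<Integer> path(double[][] positions, int start, int goal) {
            boolean[] visited = new boolean[positions.length];
            List<Integer> playlist = new ArrayList<Integer>();
            int current = start;
            playlist.add(current);
            visited[current] = true;
            while (current != goal) {
                int next = -1;
                double bestCost = Double.MAX_VALUE;
                for (int i = 0; i < positions.length; i++) {
                    if (visited[i]) continue;
                    double cost = dist(positions[current], positions[i])
                                + dist(positions[i], positions[goal]);
                    if (cost < bestCost) {
                        bestCost = cost;
                        next = i;
                    }
                }
                current = next;          // the goal itself is always a candidate,
                playlist.add(current);   // so the walk eventually reaches it
                visited[current] = true;
            }
            return playlist;
        }

        private static double dist(double[] a, double[] b) {
            double sum = 0.0;
            for (int i = 0; i < a.length; i++) {
                double d = a[i] - b[i];
                sum += d * d;
            }
            return Math.sqrt(sum);
        }
    }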

In this talk we discuss some of the problems inherent in
traditional music recommenders, and how a content-based
approach can help improve music recommendations.  We
discuss some of the algorithms involved in building a
music similarity model, including the digital signal
processing algorithms used for extracting music features
and the machine learning algorithms used for identifying
significant patterns in the music. We also discuss some
of the algorithms used to generate immersive, interactive
visualizations of a music space using the Java3D API.
Finally, we present a demonstration of the Search Inside
the Music system.
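
For a flavor of the Java3D side (again, not the SITM visualization
itself), the sketch below drops one sphere per song at its music-space
coordinates into a SimpleUniverse scene; the class name and the
made-up positions are assumptions for illustration:

    // Hypothetical sketch: one sphere per song, placed at its position
    // in the music space, rendered with the Java3D utility classes.
    import java.awt.GraphicsConfiguration;
    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Canvas3D;
    import javax.media.j3d.Transform3D;
    import javax.media.j3d.TransformGroup;
    import javax.swing.JFrame;
    import javax.vecmath.Vector3f;
    import com.sun.j3d.utils.geometry.Sphere;
    import com.sun.j3d.utils.universe.SimpleUniverse;

    public class MusicSpaceViewer {

        /** Build a scene graph with one small sphere per song position. */
        static BranchGroup createScene(float[][] positions) {
            BranchGroup scene = new BranchGroup();
            for (float[] p : positions) {
                Transform3D move = new Transform3D();
                move.setTranslation(new Vector3f(p[0], p[1], p[2]));
                TransformGroup tg = new TransformGroup(move);
                tg.addChild(new Sphere(0.03f));   // one sphere per song
                scene.addChild(tg);
            }
            scene.compile();
            return scene;
        }

        public static void main(String[] args) {
            // A few made-up song positions, just for illustration.
            float[][] positions = { { -0.5f,  0.2f, -1.0f },
                                    {  0.3f, -0.1f, -1.2f },
                                    {  0.0f,  0.4f, -0.8f } };

            GraphicsConfiguration config = SimpleUniverse.getPreferredConfiguration();
            Canvas3D canvas = new Canvas3D(config);
            SimpleUniverse universe = new SimpleUniverse(canvas);
            universe.getViewingPlatform().setNominalViewingTransform();
            universe.addBranchGraph(createScene(positions));

            JFrame frame = new JFrame("Music Space");
            frame.getContentPane().add(canvas);
            frame.setSize(600, 600);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        }
    }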
 
