I just read High Performance Web Sites, an excellent O'Reilly book about improving the performance of your website by focusing on the frontend. The book gives 14 weighted rules to follow that will improve the load time of your web pages. Reducing the number of HTTP requests, gzipping components, and adding far-future Expires headers all lead to pages that load much faster.

The author also describes a tool called YSlow that measures a number of performance aspects of your page. It reports the total load time, the page weight, and a breakdown of component load times, as well as an overall score that indicates how well optimized a page is. A score of A means that the website is doing all it can to eke out performance, while a score of F means that there is plenty of room for improvement.

I applied this tool to 50 well-known Music 2.0 sites and recorded the front-page load time, the YSlow score, the page weight (the amount of data downloaded), and the total number of HTTP requests needed to download the page. As you can see from the table, Music 2.0 sites have much room for improvement. The average load time for a Music 2.0 page is 6.6 seconds; Ruckus was the worst, with a load time of over 30 seconds. Ruckus also has the lowest YSlow score of 30, showing that there are lots of things Ruckus can do to improve its page loading time. According to YSlow, for starters, Ruckus could combine its 23 JavaScript files into one, saving 22 very expensive HTTP requests.

Note that Flash-heavy sites like Musicovery and MusicLens appear to do well here, but they've just shifted the loading time into a Flash app that is not measured by YSlow.

Name                 Load time (secs)   Score    Page weight (K)   HTTP requests

Amazonmp3            4.722              D (65)   325.3             57
Aol music            7.6                F (36)   306.9             91
All music guide      12.013             F (37)   479.2             113
Amie street          3.291              F (39)   576.3             51
Artistserver         5.537              F (44)   173.3             39
blogs.sun.com        2.4                F (58)   103.3             22
Cruxy                3.491              F (53)   707.3             43
Facebook             6.45               F (43)   197               110
Finetune             10.6               F (53)   129.6             22
Goombah              5.269              F (49)   185.2             33
Grabb.it             3.398              D (67)   77.4              32
Grooveshark          2.303              D (60)   254.4             30
Haystack             5.796              F (50)   495.7             71
iLike                4.71               D (62)   190.8             24
Lala                 5.548              B (83)   33.8              19
Last.fm              3.544              C (71)   408.3             59
MP3 Realm            0.802              B (83)   18.3              7
Midomi               29.872             D (63)   238               34
Mog                  14.712             B (80)   293.3             53
Mp3tunes             10.886             F (56)   439.9             45
MusicBrainz          7.8                F (44)   369               35
MusicIP              1.517              C (72)   80.3              11
MusicLens            0.965              A (96)   5                 2
MusicMobs            2.372              D (62)   161.4             31
music.of.interest    2.239              C (79)   105.2             7
Musicovery           1.315              A (92)   144.6             5
MyStrands            5.463              F (55)   222.5             33
Napster              3.876              F (27)   374.9             58
OWL Multimedia       3.963              F (36)   469.2             49
OneLLama             11.777             F (40)   341.4             59
Pandora              5.732              D (63)   957.5             17
QLoud                1.336              F (52)   206.7             38
Radio Paradise       2.445              F (55)   254               60
Rate Your Music      7.951              C (71)   634.9             71
Rhapsody             7.183              F (31)   782.1             121
Ruckus               34.039             F (30)   766.2             73
Shoutcast            2.452              D (61)   96.4              22
Shazam               10.442             F (56)   507.8             46
Slacker              16.803             F (48)   373.5             25
Snapp Radio          11.747             C (78)   38.6              8
Snapp Radio2         0.986              B (85)   68.2              5
Songbird             5.636              D (61)   621.2             37
Soundflavor          7.879              F (44)   559.1             60
Spiral Frog          3.535              F (46)   407.6             90
The Filter           8.229              F (37)   710.3             61
The Hype Machine     6.395              F (48)   159.2             32
Hype Machine (beta)  3.92               F (56)   233.2             49
Tune core            5.945              F (47)   222.3             56
Yahoo Music          5.211              F (55)   481.6             35
Youtube.com          4.955              D (65)   204.2             65
YottaMusic           2.127              C (76)   51.3              42
ZuKool Music         4.046              D (67)   244.3             20

I've also included Snapp Radio (a mashup that I wrote a while back) along with a rewrite (Snapp Radio2) that uses the Google Web Toolkit. One of the primary goals of GWT is to make sure that your web app performs well. That shows here: the load time for my app went from a laggy 12 seconds to a snappy one second.

Looking at this data, just about all of the Music 2.0 sites could cut their page load times in half with a few simple techniques. Combining JavaScript code into a single file and adding far-future Expires headers to JavaScript and images take little time to implement but can have a surprisingly large positive impact on performance.
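
The Expires-header and gzip fixes, for instance, are only a few lines of web-server configuration. Here's a rough sketch for Apache (it assumes mod_expires and mod_deflate are enabled; the cache lifetimes are illustrative, not a recommendation):

```apache
# Far-future Expires headers for static assets (illustrative lifetimes)
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType image/png              "access plus 1 year"
    ExpiresByType image/jpeg             "access plus 1 year"
    ExpiresByType text/css               "access plus 1 month"
    ExpiresByType application/javascript "access plus 1 month"
</IfModule>

# Gzip text components before sending them over the wire
<IfModule mod_deflate.c>
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>
```

The catch with far-future headers is that you must rename a file (or version its URL) whenever it changes, since browsers won't ask for it again.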

I highly recommend the book, along with Yahoo's Exceptional Performance team's website. Both are filled with techniques and tools for improving web site performance.
 

Comments:

The most important aspect of optimizing front-end web development is making the entire user experience seem as fast as possible. At Grooveshark, we used a two-pronged attack to make the site as responsive as possible for our users (given, of course, that we’re still in beta and have a little while to go with our optimization work).

The first step is making pages load faster. On the front-end side, we accomplish this by using sprites for bullet-sized images, putting all of the JavaScript in one file, manipulating CSS and JavaScript positioning, and compressing files to decrease their size. One of the biggest killers of website performance is the number of HTTP requests: on today's fast internet connections, setting up a request usually takes longer than downloading the actual file, so spriting images and combining files really speed up page loads because they both decrease the number of requests. We also found that compressing those combined files achieves a better compression rate than compressing them separately.
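
The sprite technique mentioned above can be sketched in a few lines of CSS (sprites.png and the icon offsets are hypothetical; it assumes 16x16 icons stacked vertically in one image):

```css
/* All bullet-sized icons live in one image, so one HTTP request */
.icon       { background: url(sprites.png) no-repeat; width: 16px; height: 16px; }
.icon-play  { background-position: 0 0; }      /* first icon in the strip */
.icon-pause { background-position: 0 -16px; }  /* second icon */
.icon-stop  { background-position: 0 -32px; }  /* third icon */
```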

For actual JavaScript optimizations, Grooveshark tries to access the DOM as little as possible. For repeatedly used elements, caching references drastically improves performance. Another way of decreasing DOM accesses is to put event handlers directly as attributes in the HTML itself. This blurs the line between logic and content, but the user only cares about fast, easy-to-use websites, and this helps take a big step in that direction. Another optimization technique that is not always obvious is to put off any JavaScript work until it is absolutely necessary, so that intensive page processing doesn't slow down the initial page load.
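
That last idea can be sketched as a small lazy-initialization helper in plain JavaScript; the "expensive" step here is a hypothetical stand-in:

```javascript
// Wrap an expensive computation so it runs only on first use,
// keeping it off the initial page-load path.
function lazy(compute) {
  let cached;       // result, filled in on first call
  let done = false; // whether compute has run yet
  return function () {
    if (!done) {
      cached = compute();
      done = true;
    }
    return cached;
  };
}

// Hypothetical expensive step that we don't want at page load
let runs = 0;
const getIndex = lazy(function () {
  runs += 1;
  return ['track-a', 'track-b']; // stand-in for real work
});
```

Calling getIndex() any number of times performs the work exactly once, and until the first call the page pays nothing.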

These techniques have served us well, but we still have some more work to do before we'll be content with our optimization and load times. Reading up on basic code optimization is always a priority, and doing as much processing as possible on the server also helps deliver a faster experience for the user. Plus, all of these optimizations depend on proper benchmarking; an optimization is useless without numbers backing it up.

As for the future of Grooveshark, there are still more improvements to be made, and new features will be developed that add value for the user while keeping any excessive load time away from the front-end experience.

Anyway, thanks for the writeup! That comparison chart already helped us gauge some good metrics about the frontend optimization of many of the other sites out there in the digital music scene. And hopefully next time you check us out, we’ll have a passing YSlow grade. Keep an eye out. :-)

Chanel Munezero
Web Developer
http://www.grooveshark.com/

Posted by Chanel Munezero on October 15, 2007 at 06:46 PM EDT #

I used YSlow for profiling when building the ZuKool front end, and we worked hard on optimization due to the very large download size of the front-end web application. Interestingly, I would get a B or a C for performance and you got a D? Perhaps that is due to your network bandwidth (my tests were on a 100 Mb fibre line, common in Japanese homes)?

One point to bear in mind is that one of YSlow's marking points is using a content distribution network like Akamai, and you are docked a mark for not using one. I am not sure of their current rates, but this is probably only possible for sites with very high traffic and deep pockets. Without that point everyone might move up a notch. No excuse for Rhapsody, however.

Another point to bear in mind, and one that could be added to your handy list, is what kind of site you are accessing. While my front end is very heavy in the first instance, it only downloads once; being a web application, as opposed to a web site, there are no more pages and no more downloads (except updated content data) after the initial one.

A small word of caution to readers: if you are buying a web performance book, make sure you read the date! I bought a similar O'Reilly book but failed to see that the publish date was 2002, so it was desperately out of date, talking about the web 1.0 world of static text pages. Ouch!

Posted by Ian on October 15, 2007 at 08:30 PM EDT #

Ian: Yep, Zukool gets a D (and still does). The YSlow benchmark is independent of actual download speed. Here's the report for zukool:

A 1. Make fewer HTTP requests
F 2. Use a CDN
F 3. Add an Expires header
B 4. Gzip components
A 5. Put CSS at the top
A 6. Put JS at the bottom
A 7. Avoid CSS expressions
n/a 8. Make JS and CSS external
A 9. Reduce DNS lookups
A 10. Minify JS
A 11. Avoid redirects
A 12. Remove duplicate scripts
F 13. Configure ETags

Posted by Paul on October 15, 2007 at 08:42 PM EDT #

Paul,

Thanks for running this test. I'm going to order this and a couple of other books and disperse them in my community and see if our next release can get us to a C or better. :)

Posted by Mayhem on October 16, 2007 at 01:18 PM EDT #

Post a Comment:
Comments are closed for this entry.

This blog copyright 2010 by plamere