Making getpersonas.com blazing fast

I love speed. I’m a speed fanatic. Fast cars, planes, skydiving, anything involving triple-digit MPH. But the thing I like fast most of all is websites. Millisecond response times, content delivery networks, caching, sprites, you name it.

In case you didn’t already know, Personas (AKA ‘lightweight themes’) have been integrated into Firefox 3.6. With the impending tsunami of traffic from millions of Firefox users headed towards getpersonas.com, I knew my time had come. Armed with Firebug, YSlow and webpagetest.org, I set out to squeeze every last drop of performance out of the site.

I knew my work would be painstaking, arduous and gut-wrenching, but I couldn’t let the fear of disappointing millions of Firefox users around the world stop me.

Minify

My first task: concatenation and minification of JavaScript and CSS. My weapon of choice: Minify.

Minify is a neat little PHP library that takes a list of CSS or JavaScript files, concatenates them into one file, minifies them (removes whitespace and line breaks) and gzips them. The plus is that it doesn’t require running any code before you push your files live; you can point your CSS & JavaScript URLs directly at Minify, like so: http://getpersonas-cdn.mozilla.net/static/min/?g=css&r=59272. (g=css tells Minify I want the ‘css’ group of files.)
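
For the curious, those groups are defined in Minify’s groupsConfig.php. Here’s a minimal sketch of what ours might look like (the file names are made up for illustration):

<?php
// min/groupsConfig.php: maps group keys (the ?g= parameter) to the
// source files Minify should concatenate, minify and gzip.
// Paths starting with // are relative to the document root.
return array(
    'css' => array(
        '//static/css/reset.css',     // hypothetical file names
        '//static/css/personas.css',
    ),
    'js' => array(
        '//static/js/jquery.js',
        '//static/js/personas.js',
    ),
);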

The benefits of Minify are fewer HTTP requests and smaller file sizes thanks to minification and gzip. For getpersonas.com, 3 JavaScript files were reduced to 1.

Move JavaScript to the bottom of all pages

Next, in order to make getpersonas.com appear faster, I knew I had to move JavaScript to the bottom of every page.

From a previous test on webpagetest.org, it was apparent that users were waiting for the JavaScript in the head of the page to download before any other content could. This is a well-known ‘feature’ of all web browsers: while a script downloads, the browser blocks parallel downloads on all hostnames, period. The remedy? Move JavaScript to the end of the page.

Thankfully I hadn’t written any inline JavaScript, so moving the reference to the main JavaScript file to the bottom of every page was relatively simple. The benefit? Rendering of each page started 500ms earlier. While seemingly small, half a second is well within what users can perceive.
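
In template terms the change is tiny. Here’s a sketch of what the tail end of a page looks like with the script moved down (the footer file name is hypothetical):

<?php /* footer.php (hypothetical), included at the bottom of every page */ ?>
    <!-- The single minified script, loaded last so it no longer blocks
         downloading or rendering of the content above it -->
    <script type="text/javascript"
      src="http://getpersonas-cdn.mozilla.net/static/min/?g=js&amp;r=59113"></script>
  </body>
</html>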

Add far-future expires headers

My next task? Adding far-future expires headers to all CSS, JS and images.

When visitors browse your website, they often view multiple pages that share common resources: images, CSS, JavaScript, etc. Unfortunately, by default browsers re-check each file on every page load to see if it has changed. That means many requests to the webserver, each incurring a non-trivial cost, sometimes in the hundreds of milliseconds. The way to avoid this is with far-future expires headers.

Expires headers tell browsers how long they can keep a file in their cache. If the header tells the browser to cache a file far into the future, the browser doesn’t need to check whether it has changed at all. This is what they look like:

Expires: Thu, 15 Apr 2020 20:00:00 GMT

In this case, the file doesn’t expire until April 15, 2020 at 8:00pm. Adding this to a .htaccess file in /static gave all our images proper expires headers:

<FilesMatch "\.(jpg|jpeg|png|gif|ico)$">
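# Cache all matched image files for 10 years from first access
# (requires mod_expires to be enabled)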
ExpiresActive On
ExpiresDefault "access plus 10 years"
</FilesMatch>

“But wait!”, you say. How will browsers know when a file has changed if they never check for a new version? Easy: change the filename or append a query string. All persona images have a modified timestamp at the end of their URLs: http://getpersonas-cdn.mozilla.net/static/8/8/48888/preview.jpg?1260925626, and CSS & JavaScript have their SVN revision number instead: http://getpersonas-cdn.mozilla.net/static/min/?g=js&r=59113. (And Minify is configured to send far-future expires headers.) For ‘regular’ images used throughout the site, appending a new version number to the end of the URL is easy enough.
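
Generating those URLs is cheap. Here’s a sketch of the kind of helper that does it (static_url() is a hypothetical name, not the actual getpersonas.com code):

<?php
// Hypothetical cache-busting helper: the query string changes whenever
// the file does, so a far-future-cached URL can never go stale.
function static_url($path) {
    $file = $_SERVER['DOCUMENT_ROOT'] . '/static/' . $path;
    return 'http://getpersonas-cdn.mozilla.net/static/' . $path
         . '?' . filemtime($file);   // last-modified timestamp
}

// static_url('8/8/48888/preview.jpg')
// => http://getpersonas-cdn.mozilla.net/static/8/8/48888/preview.jpg?1260925626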

For users, this means a faster overall site. Repeat view time decreased from 1.6 seconds to 0.7 seconds.

Use a CDN

Last but not least, we needed to move our content closer to users around the world. How? A content delivery network (CDN).

A big problem with the Internet is that people all around the world use it, sometimes from thousands of miles away from your webserver. Fiber optic cables under the ocean may be fast, but 50 or 100ms of round-trip time adds up when users need to download dozens of files: at 100ms per round trip, 20 requests can mean two full seconds spent on latency alone, before a single byte of content arrives (parallel connections help, but only so much). Short of inventing faster-than-light communication, the next easiest thing to do is move your content closer to your users.

Instead of buying our own servers and copying files to them, we’re using a CDN for getpersonas.com. A CDN has servers set up around the world and has done all the hard work for us already. 🙂

All our images, JavaScript and CSS are served from the CDN’s servers. We achieved this by setting up a separate hostname that points to them (getpersonas-cdn.mozilla.net). When a browser requests a file from getpersonas-cdn.mozilla.net, the CDN fetches it from our webservers, caches it (thanks to our far-future expires headers 🙂) and serves it to the browser.
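
Conceptually, here’s what the CDN’s edge servers do on every request (a sketch only; $cache and $origin stand in for the provider’s machinery, not real CDN code):

<?php
// Origin-pull, sketched: serve from the edge cache when possible,
// otherwise fetch from our webservers and cache the response.
function handle_edge_request($url, $cache, $origin) {
    $cached = $cache->get($url);
    if ($cached !== null) {
        return $cached;                 // cache hit: served from the edge
    }
    $response = $origin->fetch($url);   // cache miss: pull from our webservers
    $cache->put($url, $response);       // kept for years, thanks to the
    return $response;                   // far-future expires headers
}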

Since our CDN has servers in multiple locations around the world, users download files faster and with lower latency.

Results?

This post wouldn’t be worth reading without some pretty charts and graphs now, would it? (Results according to webpagetest.org, a great performance tester.)

Summary:

  • 500ms earlier rendering start time
  • 300ms shorter download time (most likely even faster for users across the globe)
  • >2x increase in repeat view speed

Before:

             Load Time  First Byte  Start Render | Document Complete                      | Fully Loaded
                                                 | Time    Requests  Bytes In  Bandwidth   | Time    Requests  Bytes In
First View   3.617s     0.792s      1.763s       | 3.617s  26        305 KB    885.23 Kbps | 3.617s  26        305 KB
Repeat View  1.638s     0.418s      0.906s       | 1.638s  26        10 KB     N/A         | 1.830s  26        10 KB


After:

             Load Time  First Byte  Start Render | Document Complete                      | Fully Loaded
                                                 | Time    Requests  Bytes In  Bandwidth   | Time    Requests  Bytes In
First View   3.312s     0.780s      1.306s       | 3.312s  25        318 KB    1.03 Mbps   | 3.312s  25        318 KB
Repeat View  0.730s     0.430s      0.534s       | 0.730s  2         8 KB      N/A         | 1.153s  2         8 KB


One more thing:

For an overview of how much we’ve sped up getpersonas.com since October 2009, here’s a speed test from then (before a site redesign):

             Load Time  First Byte  Start Render | Document Complete                      | Fully Loaded
                                                 | Time    Requests  Bytes In  Bandwidth   | Time     Requests  Bytes In
First View   5.225s     0.782s      2.937s       | 5.225s  27        487 KB    898.34 Kbps | 5.460s   27        487 KB
Repeat View  61.028s    0.433s      13.087s      | 61.028s 27        27 KB     N/A         | 61.028s  27        27 KB


That’s a decrease of two seconds in load time, and rendering now starts 1.7 seconds earlier. Not bad, eh?

[1]: An additional JavaScript file was added between the tests shown at the bottom of this post.

Special thanks to Stephen Donner, Krupa Raj, Jeremy Orem and Matthew Zeier for their hard work in making getpersonas.com blazing fast.

5 responses

  1. Hesse wrote on :

    Insightful.
    Thanks for sharing your thoughts on this.

  2. Blake Cutler wrote on :

    Wow, this is awesome! Nice work Ryan.

  3. Michael wrote on :

    Thanks for the post! I like the graphs and tables. Congrats on the speedup!

  4. Thomas wrote on :

    “Move JavaScript to the bottom of all pages”

jQuery is used on getpersonas.com as the JavaScript framework. JS frameworks usually include a ‘domready’ event that fires as soon as the DOM becomes usable. Images don’t have to be loaded for ‘domready’ to fire.

So far I haven’t told you anything new, but here’s my question: when moving JS to the bottom of the page, wouldn’t all the images load before your JS, making ‘domready’ fire only after the images have loaded?

  5. Alain wrote on :

    I noticed some images aren’t included in the main ‘sprite’ image. If you’re trying to go for ‘blazing fast’, every HTTP request should count.

    A quick test with Google PageSpeed also revealed the CSS has quite a few inefficient CSS selectors.

I’m not trying to downplay your achievements; I wish more people/websites would put the same effort into their site efficiency.