OSCON 2006: Ajax Optimization Techniques

This week, Kevin Yank is reporting from OSCON 2006 in Portland, OR.

Kevin Henrikson of Zimbra gave a brisk presentation covering some of the lessons his organization has learned and the “dirty tricks” it has implemented to improve the performance of web applications that rely on large JavaScript/CSS codebases. Here’s a quick run-down of the items he covered. The slides of the talk are up on the Zimbra blog.

First, a great tool for spotting problems: Web Page Analyzer will report on the relative sizes of the HTML, CSS, JavaScript, and images that make up any given page. Sometimes developers will work really hard to compress their images, only to serve hundreds of kilobytes of JavaScript code, and this tool will let you spot such issues in a hurry.

Kevin’s first piece of advice was not to be afraid to combine multiple files into a single file. This works for both JavaScript and CSS, and although it doesn’t cut down on the size of the data, it can significantly improve load time, because browsers will only request a small number of files at once (typically just two simultaneous connections per server in the browsers of the day), so every extra file adds round-trip latency.
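Here’s a minimal sketch of that build step (the file names are invented for illustration) using simple shell concatenation:

    # Combine scripts and stylesheets into one file each, in dependency order
    cat core.js widgets.js app.js > all.js
    cat reset.css layout.css theme.css > all.css

The page then references all.js and all.css once each, turning six requests into two.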

As I’ve covered before, compressing JavaScript can be a tricky problem to solve. There are a lot of tools out there that are utter crap, but here are a few that work fairly well (a couple of quick examples follow the list):

  • JSMin — aggressively strips comments and whitespace, but doesn’t rename anything
  • ShrinkSafe — less aggressive (more readable output), but shortens variable names
  • mod_gzip, mod_deflate — Apache modules that compress web content before sending it to browsers that can decompress it
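To make the first two concrete, here’s a hypothetical function before and after JSMin-style treatment — comments and whitespace stripped, names untouched:

    // Before: readable source
    function addNumbers(first, second) {
        // sum two values
        return first + second;
    }

    // After: same names, no comments or extra whitespace
    function addNumbers(first,second){return first+second;}

And on the server side, assuming mod_deflate is loaded, a single Apache 2 directive will compress the text content types as they’re served:

    AddOutputFilterByType DEFLATE text/html text/css application/x-javascript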

Once you’ve optimized your JavaScript (and other code), you should make sure that code is properly cached on the client side. The key here is to know your HTTP headers, and the best way to do that is to get a tool like Tamper Data, a Firefox extension that lets you analyze the HTTP requests and responses going on behind the scenes.
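For example, a static JavaScript file that caches well might come back with headers along these lines (the values here are illustrative, not taken from the talk):

    HTTP/1.1 200 OK
    Content-Type: application/x-javascript
    Last-Modified: Wed, 26 Jul 2006 08:00:00 GMT
    Expires: Thu, 26 Jul 2007 08:00:00 GMT
    Cache-Control: max-age=31536000

A far-future Expires or Cache-Control value like this means repeat visitors won’t ask the server for the file again at all; without it, you’ll see a stream of conditional requests and 304 responses scroll past in Tamper Data.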

Kevin then went on to look at a couple of case studies he’s dealt with. The first centered on a high-profile blog post he wrote some seven months ago, AJAX and CSS Optimization, which criticized the then newly-launched version of Digg. Over 80% of Digg’s front end consisted of JavaScript code, which Kevin managed to shrink significantly using the compression tools mentioned above.

His second case study was lala.com. The site’s HTTP caching is immaculate, but it requires the browser to download so many files that the limit on simultaneous requests slows its load time significantly. This is readily apparent when you view the site’s request timeline in a tool like Tamper Data.

  • Rangi Keen

    Another tool that is worth checking out is Fiddler (available at http://www.fiddlertool.com/). It integrates nicely with IE, but can work as a proxy for any browser. It gives you feedback similar to Tamper Data and includes a pie chart of the various MIME types (HTML, JS, CSS, etc.) that make up a list of requests.

  • z0s0

    Packaging up your JS files into a single bundle is a really good idea. I’ve seen a massive increase in load speed by doing so. Firefox seems to only display the page once all the JavaScript is loaded, which contributes to the importance of this.

    Interestingly, Cal from Flickr wrote a great article about this, and yet Flickr has no fewer than 17 (!!) separate .js includes on the home page alone. Little wonder Flickr feels so slow to me.

  • Dave Grijalva

    One key thing to think about when jumbling all your files together is the balance between the number of files and the amount of redundant code sent to the client. For an application like Flickr, every view has code that’s specific to it. It makes sense to bundle as much code as possible into a single file, but anything that’s page-specific should be left out.

    The goal is to minimize load times (especially of the first page) by intelligently grouping blocks of code into single files, making smart use of caching, and not sending the client view-specific code until it’s needed.
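    For instance, with made-up file names, each view might include one shared, aggressively cached bundle plus one small file of its own:

        <!-- shared bundle, cached across every view -->
        <script type="text/javascript" src="/js/all.js"></script>
        <!-- code that only this view needs -->
        <script type="text/javascript" src="/js/photo-page.js"></script>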