OSCON 2006: Ajax Optimization Techniques

    Kevin Yank

    This week, Kevin Yank is reporting from OSCON 2006 in Portland, OR.

    Kevin Henrikson of Zimbra gave a brisk presentation covering some of the lessons his organization has learned and the “dirty tricks” it has implemented to improve the performance of web applications that rely on large JavaScript/CSS codebases. Here’s a quick run-down of the items he covered. The slides of the talk are up on the Zimbra blog.

    First, a great tool for spotting problems: Web Page Analyzer will report on the relative sizes of the HTML, CSS, JavaScript, and images that make up any given page. Sometimes developers will work really hard to compress their images, only to serve hundreds of kilobytes of JavaScript code, and this tool will let you spot such issues in a hurry.

    Kevin’s first piece of advice was not to be afraid to combine multiple files into a single file. This works for both JavaScript and CSS, and although it doesn’t cut down on the size of the data, it can significantly improve load time, because browsers will only download a limited number of files at a time, so every extra file adds request overhead.
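
    In practice, combining files can be as simple as concatenating them during your build or deployment step and pointing your markup at the combined file. Here’s a minimal sketch — the file names are purely illustrative, and note that concatenation order still matters if one script depends on another:

        # build step: concatenate individual source files into one download each
        cat menu.js editor.js mail.js > combined.js
        cat layout.css widgets.css > combined.css

        <!-- the page now makes two requests instead of five -->
        <script type="text/javascript" src="combined.js"></script>
        <link rel="stylesheet" type="text/css" href="combined.css" />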

    As I’ve covered before, compressing JavaScript can be a tricky problem to solve. There are a lot of tools out there that are utter crap. But here are a few that work fairly well:

    • JSMin — aggressive, but doesn’t modify names
    • ShrinkSafe — less aggressive (more readable output), but reduces variable names
    • mod_gzip, mod_deflate — Apache modules that will compress web content before sending it to a browser that supports decompressing it (see the configuration sketch below)
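
    For the Apache modules, a minimal mod_deflate setup on Apache 2.x might look something like this sketch; the MIME types listed are illustrative, and mod_gzip (the Apache 1.3 equivalent) uses its own directives. Older browsers that mishandle compressed content may also need BrowserMatch exceptions.

        # httpd.conf — compress text-based responses before they leave the server
        # (assumes mod_deflate is loaded)
        AddOutputFilterByType DEFLATE text/html text/css application/x-javascript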

    Once you’ve optimized your JavaScript (and other code), you should make sure that code is properly cached on the client side. The key here is to know your HTTP headers, and the best way to do that is to get a tool like Tamper Data, a Firefox extension that lets you analyze the HTTP requests and responses going on behind the scenes.
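
    As a rough illustration, the response headers you’d hope to see in Tamper Data for a static, rarely-changing JavaScript file might look something like this (the values here are invented for the example):

        HTTP/1.1 200 OK
        Content-Type: application/x-javascript
        Content-Encoding: gzip
        Cache-Control: max-age=2592000
        Expires: Wed, 23 Aug 2006 20:00:00 GMT
        Last-Modified: Mon, 24 Jul 2006 10:00:00 GMT

    With a long max-age or a far-future Expires header, the browser can reuse its cached copy without asking the server again; when the code does change, serving it under a new file name sidesteps the stale cache.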

    Kevin then went on to look at a couple of case studies he’s dealt with. The first centered on a high-profile blog post he wrote some seven months ago, AJAX and CSS Optimization, which criticized the then newly-launched version of Digg. Over 80% of Digg’s front end was JavaScript code, which he managed to shrink significantly using the compression tools mentioned above.

    His second case study was lala.com. The site’s HTTP caching is immaculate, but it requires the browser to download so many files that the limit on simultaneous requests slows load time significantly. This is readily apparent when viewing the site’s request timeline in a tool like Tamper Data.