Faster Page Loads – Bundle Your CSS and Javascript

By Paul Annesley

Have you ever watched your status bar while you wait for a page to load and wondered why several files seem to be downloaded before you see anything at all on your screen? Eventually the page content displays, and then the images are slotted in.

The files that keep you waiting are generally the CSS and Javascript files linked to from the “head” section of the HTML document. Because these files determine how the page will be displayed, rendering is delayed until they are completely downloaded.

HTTP Overhead

For each of these files, an HTTP request is sent to the server, and the browser awaits the response before requesting the next file. Browser connection limits generally prevent parallel downloads. This means that for each file, you wait for the request to reach the server, for the server to process it, and for the reply (including the file content itself) to reach you. Put end to end, a few of these can make a big difference to page load times.
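The cost of those round trips is easy to model. Here's a back-of-envelope sketch, with illustrative timings that are assumptions rather than measurements from this article, of why serial requests add up:

```python
# Back-of-envelope model of serial HTTP fetches before first render.
# The timings below are illustrative assumptions, not measurements.

def page_blocking_time(num_files, rtt, server, transfer):
    """Seconds before rendering can start when head-section files
    are fetched one after another (no parallelism, no cache)."""
    return num_files * (rtt + server + transfer)

# Five stylesheets at 200 ms round trip, 50 ms server work, 100 ms transfer:
sequential = page_blocking_time(5, 0.200, 0.050, 0.100)   # 1.75 s
# One bundled request carrying the same bytes:
bundled = page_blocking_time(1, 0.200, 0.050, 5 * 0.100)  # 0.75 s
```

Even under generous assumptions, four full round trips are avoided, which is exactly the saving the bundling technique below aims for.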

Example Case

As a demonstration, I’ve created an empty HTML document which loads 5 stylesheets. Browser cache is disabled, and I’ve used Firebug, an invaluable tool for any web developer, to visualise the HTTP timing information. Oh, and I’ve omitted the DOCTYPE declaration for brevity only – no real sites were harmed in the writing of this article.

<title>Apache / PHP</title>
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />

Firebug HTTP timing visualisation for CSS from apache

A total load time of 2.24 seconds. For an empty page.

Let's see what happens if we request the same stylesheets from lighttpd, a lighter-weight web server that is not weighed down by PHP. It's worth noting that using a separate domain for static content means any cookies set on your main domain will not be sent with these HTTP requests. Smaller requests are faster requests.

<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />
<link rel="stylesheet" type="text/css" href="" />

Firebug HTTP timing visualisation for CSS from lighttpd

We’ve shaved off half a second, but I suspect the bulk of the time is spent going back and forth to the server for each file. So how do we improve the load time further?

Static Content Bundling

Maintaining small, modular stylesheets can make development easier, and makes it possible to deliver styles specifically to the pages that require them, reducing the total CSS served per page. But as we're now seeing, there's a price to pay in HTTP overhead.

This is where you gain performance from server side bundling of static content. Put simply, you make a single request to the server for all the CSS files required for a page, and the server combines them into a single file before serving them up. So you still upload your small, modular stylesheets, and the server is capable of delivering any combination of them as a single file.

Using the technique described in an article at (PHP script also available there), we can handle special URLs that look like this:,two.css,three.css
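As a rough illustration of that URL scheme, here is a minimal sketch (in Python rather than the article's PHP; the `css/` directory and the function name are hypothetical) of splitting the comma-separated filename list and concatenating the files:

```python
# Sketch of serving "one.css,two.css,three.css" as a single response body.
# Hypothetical layout: individual stylesheets live in a css/ directory.
import os

CSS_DIR = "css"

def bundle(filenames_param):
    """Split the comma-separated list and concatenate the files' contents."""
    parts = []
    for name in filenames_param.split(","):
        # Allow only plain .css filenames: no paths, no directory traversal.
        if os.path.basename(name) != name or not name.endswith(".css"):
            raise ValueError(f"illegal filename: {name!r}")
        with open(os.path.join(CSS_DIR, name), encoding="utf-8") as f:
            parts.append(f.read())
    return "\n".join(parts)
```

A real implementation would also set the Content-Type and caching headers; the validation line matters because the filename list arrives straight from the URL.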

I wrote a bundling implementation in Lua for lighttpd/mod_magnet so that we can still avoid the overhead of Apache/PHP. When a combination of files is bundled for the first time, the bundle is cached on the server, so it only needs to be regenerated if one of the source files is modified. If the browser supports compression, lighttpd/mod_compress handles the gzipping and caching of the compressed version.
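The cache-invalidation rule described here, regenerate only when a source file is newer than the cached bundle, can be sketched with file modification times. (This is an illustrative Python sketch with made-up names; the real implementation is the Lua/mod_magnet script.)

```python
# Sketch of the mtime check: a cached bundle is stale as soon as any of
# its source stylesheets has been modified after the bundle was written.
import os

def bundle_is_fresh(cache_path, source_paths):
    if not os.path.exists(cache_path):
        return False  # never built: must generate
    cache_mtime = os.path.getmtime(cache_path)
    return all(os.path.getmtime(p) <= cache_mtime for p in source_paths)
```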

Let's check our load time now…

<title>lighttpd bundled</title>
<link rel="stylesheet" type="text/css" href=",standard.css,format.css,index.css,promo.css" />

Firebug HTTP timing visualisation for bundled CSS

The Final Result

560ms! That's nearly 1.7 seconds quicker than the original example.

This bundling technique works for both CSS and JavaScript, though obviously you cannot mix the two in a single bundle.

  • Great stuff!

    I went so far as to consider making a build script that would combine all JS and CSS files into one file to save the additional HTTP requests, but this technique is much better. Thanks!

  • Andy T

    I recently reduced total pages sizes for a javascript-heavy application by bundling all the JS files and using gzip compression. The page was previously over 250K but is now about 30K, making it usable on dial-up.

  • chrisb

    How is this any better than just using shrinksafe (js) or similar shrink tools?

  • @chrisb: Removing whitespace or using other more complicated JavaScript compression methods will reduce the file sizes, but not affect the HTTP overhead of requesting multiple files. You can shrink the files before uploading, and then also use this bundling technique to get the best of both worlds.

  • chrisb

    Apologies, should have clarified my point – most standard compression tools allow the option of ‘bundling’ as part of their pre-processing.

    My point was meant largely as: since whitespace/comment removal is going to be necessary anyway, what's the point in having scripts worrying about caching the various files and having to monitor the filesystem for changes…

  • I agree that in many scenarios, a workflow and build process which includes pre-bundling CSS files would be great. For “out of the box” software with release cycles and versioning, there's little excuse not to do the bundling and other optimizing during packaging.

    However, this server-side bundling method is easy to drop in to an existing system without changes to your workflow – just some minor changes to your markup.

    Most of all, I think I like the granularity that server-side bundling affords. Any combination of CSS files can be requested by any page, so each page (hopefully page-controller of some sort) controls exactly which bits of CSS/JS are downloaded.

    This means that on that ever-critical first load of your site (first impressions really do last), regardless of the landing page, you know that the user didn't wait to download unused CSS or JavaScript that was part of a pre-packaged bundle. On a site with millions of page views this can make a big difference, and decent stat() caching in the server kernel can keep any filesystem monitoring expense minimal.

  • JdL

    Can using @import achieve the same effect as combining the separate files in the URL (,two.css,three.css)?

    For example, EDS.COM uses the following technique in screen.css:

    @import "layout.css";
    @import "nav.css";
    @import "page.css";
    @import "forms.css";
    @import "color.css";

  • malikyte

    @JdL: I just tested it in Firebug. Nope. I count 9 CSS files, and the imported files are fetched separately as well. Even though they're imported within one stylesheet, the server doesn't know this: it serves the CSS to the browser, which then determines it needs to go and grab 5 more CSS files, each in turn. Firebug reported 1.69 seconds to load all of the files on my connection.

  • That’s right, @import will not save on HTTP requests.

    However, I have noticed that using @import actually causes Firefox to download the CSS files in parallel rather than one by one. This reduces the critical path before rendering to (initial stylesheet) + (largest stylesheet), which is often an improvement over n × stylesheets.

    However, I believe Firefox enforces a limit of 4 connections per hostname, and Internet Explorer only two per hostname, so this benefit would only scale so far. Also, other browsers may behave differently altogether.

    Here’s a screenshot of parallel downloads via the following import statements:

    <link rel="stylesheet" type="text/css" href="/import.css" />

    And import.css:

    @import "";
    @import "";
    @import "";
    @import "";
    @import "";

    Firebug screenshot
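The arithmetic behind those per-hostname connection limits can be sketched as follows. (The timings and the greedy batching are simplifying assumptions; this ignores pipelining and TCP slow start.)

```python
import math

def load_time(n_files, per_file, max_connections):
    """Rough model: equal-cost files downloaded in greedy batches
    of at most max_connections at a time."""
    return math.ceil(n_files / max_connections) * per_file

# Five @import-ed stylesheets at 300 ms each, behind a 300 ms initial sheet:
serial = load_time(5, 0.3, 1)            # link tags, one at a time: 1.5 s
firefox = 0.3 + load_time(5, 0.3, 4)     # 4 connections per host:  0.9 s
ie = 0.3 + load_time(5, 0.3, 2)          # 2 connections per host:  1.2 s
```

The parallelism helps, but a single bundled request still avoids the initial stylesheet's extra round trip entirely.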

  • JdL

    @Paul Annesley:

    From the looks of that screenshot, it actually does reduce the time down to very close to your solution, despite there being 7 requests. It also shows that 5 files are being downloaded simultaneously–is this related to the connection limit? (or have you been playing with About:Config?)

  • Yes – I currently have network.http.max-persistent-connections-per-server increased from 2 to 10. The standard limit of 2 per server would still offer better parallelism than separate links in the head of the document.

  • Alin

    but aren’t the js and css cached by the browser after the first load?

  • JdL

    @Paul Annesley: I’m still intrigued, confused, and curious.

    In your second test (with 8kb of requests), you record a download time of 1.67 seconds. After you have implemented the script that allows multiple files in the same request, you record a download of 560 milliseconds for the same 8kb of files. When you tested the @import in a CSS file, you record 580 milliseconds for 8kb of files, but this time you show the files being downloaded simultaneously because you have your browser configured to support more connections.

    Was the 1.67 second test also with a larger-than-default max-persistent-connections?
    What about the 560 millisecond test?
    Can you run the @import test again, only this time using the default network.http.max-persistent-connections-per-server; once for IE 6 / 7 and once for Firefox?

    This is a cool discussion and a worthwhile study. Thanks for bringing it up, Paul!

  • malikyte

    Although I agree it’s a worthwhile study, the fact of the matter is that we can’t rely on best-case scenarios. Using the techniques presented in the article itself will help with the worst-case scenarios.

    JdL – how can we run the tests for IE6/7 when the tool used to diagnose the issue was a Firefox plugin? I suppose Fiddler could be used to monitor all of them, but I personally haven't used it as an HTTP request profiling tool and proxy (which would be needed to make it work with Firefox).

  • Simply using a single CSS file is the best option for now. You can separate the modules within the CSS file and call that CSS once. I think the need for more modular CSS doesn't arise in most scenarios.

  • As an old-time tuner, I would use neither multiple files nor on-the-fly bundling.

    My viewpoint is that the manual effort made by a developer to combine the files *once* is repaid many times over in saved CPU cycles. This is true of many things that could be done on the fly.

    The closer you get to 100 percent static content, the faster you get.

    If you really must, you can make the bundling part of an automated build and release process.

  • Simply using a single CSS file is the best option for now. You can separate the modules within the CSS file and call that CSS once. I think the need for more modular CSS doesn't arise in most scenarios.

    I think you're right for most scenarios. Only a small percentage of sites need as much CSS, and require the ongoing development, as something like SitePoint does.

    But running all of our CSS in a single file would be pretty difficult to manage. Some CSS is shared across the whole site (e.g. structure.css), some is specific to sections (e.g. books.css), and other CSS is specific to particular pages (e.g. promo.css, the CSS for the cover-page book promo).

    This allows us some flexibility in what CSS gets sent, but keeps the overheads low.

  • Drew

    @Alin: but aren’t the js and css cached by the browser after the first load?

    I'd have to agree; everyone seems to be glossing over the fact that the browser cache reduces the overhead to practically nothing. I understand there may be scenarios where you don't want the browser to cache it, but I'd think that would be the exception rather than the rule.

  • Nicolae Namolovan

    Can you share the Lua code that does this?

  • hessodreamy

    Am I missing something? The first example shows 23KB of files taking 2.24 seconds to download, but in the second example only 8KB is downloaded, taking 1.67 seconds.
    The methods discussed will reduce HTTP overhead, but surely not bandwidth (significantly).
    But I’m probably missing something, aren’t I?

  • hikarateboy

    Forgive me if my understanding of all this is not to the level of the rest of you.

    It's my understanding that this technique only benefits static pages by reducing the number of HTTP requests for supporting JS and CSS files, and I can see the logic in that.

    But say the page is a dynamic PHP page: wouldn't doing an include() or require() for these files within your header accomplish a similar result, since the total size is the same but only one call to the server would be made? It would seem to me that the extra load on the server to do this would be pretty much immeasurable.

  • hessodreamy

    hikarate, I think your comment goes hand in hand with an earlier comment about caching.
    For dynamic content you'd expect the content of each page to differ but, to a large extent, the CSS and JS to be the same.
    Using external CSS or JS (whether individual files or bundles, as described here), the browser should cache the file and therefore only download it once.
    If you output JS/CSS in the head of each document, the browser won't be able to rely on a cached version, hence you'll be downloading a lot more data overall.

  • netspade

    Useful tip. One would expect HTTP pipelining to have a similar effect though…

  • dave@fms

    Seems like a great idea, but I'm having trouble using it. I'm trying to implement it within a CakePHP environment, and I'm having trouble with what I believe is the mod_rewrite rules. Has anyone else tried and succeeded in integrating it with Cake, or is a mod_rewrite guru who can spot the mistake? I'll list the current file contents below. I have tried moving the two added lines for combine.php up in the order, and tried setting the RewriteBase, but nothing has worked.

    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule ^(.*)$ index.php?url=$1 [QSA,L]
        RewriteRule ^css/(.*.css) combine.php?type=css&files=$1
        RewriteRule ^js/(.*.js) combine.php?type=javascript&files=$1
    </IfModule>
    AddType application/x-javascript .js
    AddType text/css .css

    Any help would be appreciated!!

  • dave@fms

    umm….took out the two combine lines:

    RewriteRule ^css/(.*.css) combine.php?type=css&files=$1
    RewriteRule ^js/(.*.js) combine.php?type=javascript&files=$1

  • dave@fms

    I take it back, the problem doesn't seem to be mod_rewrite. It works if I only put one file at a time: the resulting file is gzipped/cached/returned. However, when I put more than one file, separated by a ',', CakePHP chokes. I suppose compression/caching is better than nothing, but I would like to have it work correctly… any thoughts?

  • JohnH

    Don't know why, but the site now displays without the bundled stylesheets applied when I try accessing it from work using IE6, IE7 or Firefox.

    IE gives the following error:
    Line: 2
    Char: 1
    Error: Syntax error
    Code: 0

    I'm accessing through a proxy, which may have something to do with it, but this site has always loaded correctly before. Any ideas?

  • jeff

    All I can do is laugh at you people. Pathetic.

  • Inspired by this article I’ve done a plugin for Ruby on Rails that does something similar:

    Rails plugin: Blazing fast page loads through bundled CSS and Javascript

  • Thought it was worth mentioning that HTTP requests aren't just for CSS and JS files. Every image, SWF, or any other file that you use on your page (including the page itself) is yet another request that adds a tiny slice to your overall load time. The difference is that most of them happen after your page is at least showing something, but you should still keep it in mind if you decide to split up images that sit right next to each other, or put 4 different Flash files on a page, etc.

    I use
    Firefox WebDeveloper Toolbar > Tools > View Speed Report
    to check stuff like this and keep my pages snappy. The first thing it shows you is the total number of HTTP requests, with an estimation of .2 second lag time for each one. Plenty of other useful information on there too, along with recommendations for speeding your pages up.

    @jeff: You’ve demonstrated quite a high level of technical prowess there. I agree it’s pretty pathetic that you have nothing to add to the discussion.

  • co.rab

    It would be nice to see more than just one number as the result for total page load, especially if the difference is only a few milliseconds. What about a series of tests with a standard deviation? It could be just a short description, but at least it would be more convincing. If you refresh several times you can see that these numbers vary a lot, even on a local machine that hosts both client and server.

    Does someone know a good article / book about measuring web site performance in total? (email me) Considering all the different parts of a web app, from server side performance, data transmission performance, client side JS performance/load and so on. There are a lot of things to consider when looking at the performance of the WHOLE application and things you can optimize.

  • ScrappyTexas

    Anybody done this with .Net rather than the PHP method?

  • JdL

    @Paul Annesley:

    I'd like to see the results of the same tests, but with @import used directly in the HTML, such as the following:

    <style type="text/css">
    @import url("css/main.css");
    @import url("css/print.css");
    @import url("css/screen.css");
    @import url("css/minimal.css");
    </style>
  • Right now SitePoint is displaying without any CSS at all. This happened last week as well.
    It seems is down. I guess that's the risk of having some files loaded from a different domain/server than the content.

  • @cranial-bore: Thanks for the report, however is not offline.
    I’ve followed this up with you via email.

  • Eviltechie

    It seems to me that most of this effort is pointless. In almost all cases CSS, JS, and images are cached by the browser after the first view. Thus, when viewing a second page on the same site, it shouldn’t be making more requests for CSS or JS files, and should only request the images it hasn’t already cached.

    For high traffic sites, forcing every page to include all of your CSS and JS could greatly increase your bandwidth usage.

    For the tests you’ve shown to be valid, you should compare first page load times with subsequent page load times. Using your method, they’re all going to be the same. Allowing the browser to cache things you’ll see the first page taking longer to load, and subsequent pages loading much faster.

  • bart

    I've noticed that Firefox 2 does not cache JS and CSS files at all. I'm using Firebug to verify this. If you hit refresh, none of the files are cached. Only when you navigate back to the page you came from will you see that images were cached and JS and CSS were not. Can anyone verify that this is default behaviour in FF, or is this some setting I turned on and forgot about? Thanks.

  • @eviltechie
    I don't think it's pointless. It's true what you say about caching for future page loads, but the point of this is to speed up loading for someone's first impression of your site. Even if you think first impressions aren't valuable enough to warrant the extra work, that's the point of the article.

    Also, why do you say subsequent loads using this method won't cache? It's the exact same request for pages that use the same combination of files, so why wouldn't it? If another file is added to the mix on another page, you would still have to download it before it could be cached anyway, so I don't see any difference in downloading the new bundle for that page.

  • bart

    OK, I think I was wrong in my last comment. It seems that Firebug may be a bit misleading in displaying the CSS and JS files as not cached. When you click to see the headers of the CSS and JS files there is a timestamp. If you navigate through the site and the timestamp doesn't change, then I guess it's safe to assume that FF cached the files. I also tried something else: to verify, I cleared the browser cache and then entered about:cache in the address bar in a new tab. Both memory cache and disk cache were 0. Then I refreshed the page in question and refreshed the about:cache tab. The correct number of requests was displayed in both memory and disk caches. When you drill down, you'll see that the files are in one of the caches.

  • @Eviltechie: The bundle will certainly be cached by the browser after the first load – we’re certainly not forcing every page to include all CSS and JS.

    @bart: Also keep in mind that clicking the reload button in Firefox is different to visiting the page again – you’re telling Firefox that you want to re-download the content.

  • It seems to me that most of this effort is pointless. In almost all cases CSS, JS, and images are cached by the browser after the first view. Thus, when viewing a second page on the same site, it shouldn’t be making more requests for CSS or JS files, and should only request the images it hasn’t already cached.

    That's assuming they get as far as looking at a second page. Many people will typically be loading 3, 4 or 5 tabs simultaneously, and if one of those hasn't started rendering a usable page in a few seconds, they'll click the 'X' and concentrate on the other tabs that have.

    A user hasn't received anything of value from you at that point, so they have no perception of loss if they leave. That makes the first-ever page load far more important than any future page load.

    Even if the user knows your site but hasn’t visited for a few months (so no cached version) getting them the first page load quickly is critical to getting them to re-commit to you and not an alternative site. This is equally true whether you’re running a blog, a forum, an online store or a content site.

  • ramesh effic

    Can I set a proxy page until the original page is downloaded from the server, in JSP?

  • David

    Hmmm, an old thing I've found on the Internet, but I need to say that any effort to speed things up for the end user is good effort. That said, the idea is great!

  • Christian Winther


    I have created such a script for lighttpd (using mod_magnet and Lua). It can be found here:

    Faster page loads: Bundle your css and javascript with lighttpd and mod_magnet (lua)

  • John

    @ Paul,

    I request you to answer one thing that I am not able to understand.

    @Eviltechie: The bundle will certainly be cached by the browser after the first load – we’re certainly not forcing every page to include all CSS and JS.

    My question is as follows: webpage A links to a1.css, a2.css, and a3.css. These are bundled into one request (, a2.css, a3.css), compressed, and sent to the browser, let's say as bundle B1. The browser caches bundle B1. The user then navigates to another page on the same website, webpage B, which links to a1.css, a2.css, and a4.css. In this case, all three CSS files are downloaded again as a bundle, let's say B2, even though a1.css and a2.css are already present inside bundle B1 cached on the client machine. Are we creating bandwidth overhead?

    Please answer as soon as possible! Thanks in advance!

  • Paul Annesley

    Hi John,

    Nearly two years later… :)

    Yes – if each of your pages uses a different combination of stylesheets, then the stylesheet bundle for each new page will be a browser-cache miss on the first load. So yes – in some cases it will cause redundancy of downloaded styles, causing slightly more data to be downloaded.

    However without the bundling, there would be at least one cache miss anyway, for the non-cached individual CSS file – and possibly multiple misses.

    There's no silver bullet; you just need to find the right balance. I'd almost always take a single slightly larger download over lots of small requests. And if you take this into account when laying out your CSS, it's especially effective.
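The trade-off in this answer can be put into numbers. Here is a quick sketch of John's scenario, page A bundling a1+a2+a3 and page B bundling a1+a2+a4, using hypothetical file sizes invented for illustration:

```python
# Hypothetical stylesheet sizes in bytes; the scenario is from the
# question above, the numbers are invented for illustration.
sizes = {"a1.css": 6_000, "a2.css": 5_000, "a3.css": 4_000, "a4.css": 3_000}

bundle_a = sizes["a1.css"] + sizes["a2.css"] + sizes["a3.css"]  # page A: 15 000
bundle_b = sizes["a1.css"] + sizes["a2.css"] + sizes["a4.css"]  # page B: 14 000

# Bundled: one request per page, but a1+a2 are downloaded twice.
bundled_bytes, bundled_requests = bundle_a + bundle_b, 2
# Unbundled: three requests for page A, then only a4.css for page B
# (a1.css and a2.css come from the browser cache).
unbundled_bytes, unbundled_requests = sum(sizes.values()), 4
```

With these made-up sizes, bundling costs 11KB of redundant download but saves two requests; which side wins depends on file sizes versus round-trip latency.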
