The HTTP Archive Report has published its end-of-year technology statistics, which collate information from 300,000 of the web’s most popular websites. Average page weight has bloated by 32% in one year to reach more than 1,700Kb (1.7Mb) and now comprises 96 individual HTTP requests. That’s even worse than the staggering 30% increase in 2012!

Some of the increase can be explained by additional ecommerce activity and advertising as people hunt for gifts. However, few websites lose weight in January; most continue to gorge themselves throughout the year.

The report analyzes publicly accessible content and shopping websites rather than complex web applications. It provides a breakdown of the specific technologies used:

Technology      End 2012    End 2013    Change
HTML                54Kb        57Kb       +6%
CSS                 35Kb        46Kb      +31%
JavaScript         211Kb       276Kb      +31%
Images             793Kb     1,030Kb      +30%
Flash               92Kb        87Kb       -5%
Other              101Kb       205Kb     +103%
Total            1,286Kb     1,701Kb      +32%

The rise in HTML is fairly negligible, although any rise is slightly surprising given the trend toward cut-down content and simpler, flatter designs. 57Kb is quite chunky for markup alone.

CSS sizes have increased by 11Kb on average. Some of that could be explained by Responsive Web Design and CSS3 effects, but the reduced requirement for vendor prefixes should have offset part of the growth.

However, any rise in HTML and CSS could have been offset by a decrease in JavaScript code: there’s less reason to use large script libraries now that we have better browser consistency and CSS3 animations. That hasn’t happened, and the average page now loads 18 individual script files; concatenation and minification would help immensely.
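
As a minimal sketch of the idea (the file names are hypothetical), those 18 requests could be collapsed into one with a few lines of Node.js, and a minifier could then shrink the combined file further:

    // Combine all scripts into a single file so the browser makes one HTTP
    // request instead of eighteen. File names are hypothetical examples.
    const fs = require('fs');

    const sources = ['jquery.js', 'carousel.js', 'analytics.js', 'app.js'];
    const combined = sources
      .map(file => fs.readFileSync(file, 'utf8'))
      .join(';\n'); // the semicolon guards against files missing their trailing one

    fs.writeFileSync('site.js', combined);
    console.log('Wrote site.js: ' + combined.length + ' bytes from ' + sources.length + ' files');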

Unsurprisingly, Flash has dropped by a few kilobytes, and pages using the plugin have fallen from 37% to 32%. Advertisers remain the primary users, but HTML5 alternatives are starting to appear now that Responsive Web Design is a mainstream technique.

“Other” files have doubled in size. Almost a third of this growth can be attributed to webfonts and webfont icon sets, which would be acceptable if it led to a reduction in image use … except it hasn’t. Perhaps high-density photographs can justify some of the increase, but who is loading a megabyte of images on every page?

The figures are more shocking when you consider they’re averages: approximately half the websites analyzed will be more obese. We web developers should hang our heads in shame.

The Reasons

What can we blame? My primary suspects are:

  1. Bloated CMS Templates
    Typical WordPress themes are crammed full of features. Many will be third-party styles and widgets the author has added to make the theme more useful or attractive to buyers. Many features will not be used but the files are still present.
  2. HTML5 Boilerplates
    A boilerplate may save time, but it’s important to understand that they are generic templates. The styles and scripts contain features you’ll never use, and the HTML can be verbose with deeply-nested elements and long-winded, descriptive class names. Few developers bother to remove the redundant code; a crude audit like the sketch after this list can flag the worst of it.
  3. Carelessness
    Developers are inherently lazy; we write software to make tasks easier. However, if you’re not concerned about the consequences of page weight, you should have your web license revoked.
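
As a rough illustration of point 2 (the file names are hypothetical, and the regex matching is only a naive stand-in for what real auditing tools do against a live DOM), a few lines of Node.js can flag class selectors that a boilerplate defines but the page never references:

    // Naive unused-selector check: compare class names used in the HTML with
    // class selectors defined in the CSS. A rough approximation only -- real
    // tools inspect the rendered DOM and dynamically-added classes.
    const fs = require('fs');

    const html = fs.readFileSync('index.html', 'utf8');
    const css = fs.readFileSync('styles.css', 'utf8');

    // Every class referenced in the markup.
    const used = new Set();
    for (const m of html.matchAll(/class="([^"]*)"/g)) {
      m[1].split(/\s+/).forEach(c => c && used.add(c));
    }

    // Every class selector defined in the stylesheet.
    const defined = new Set();
    for (const m of css.matchAll(/\.([a-zA-Z][\w-]*)/g)) {
      defined.add(m[1]);
    }

    for (const cls of defined) {
      if (!used.has(cls)) console.log('Possibly unused: .' + cls);
    }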

Even if we forget website SEO, software efficiency and user responsiveness, one in five web visits is from a phone. On the most efficient mobile network a 1.7Mb page will take one minute to download, assuming the phone or tablet is able to render it effectively at all. Would a potential customer be prepared to wait?
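
For the curious, the arithmetic behind that one-minute figure, assuming the ~300kbps effective throughput discussed in the comments below rather than a network’s advertised speed:

    // Back-of-the-envelope download time for a 1.7Mb page at an effective
    // mobile throughput of ~300 kilobits per second (real-world rates fall
    // far short of headline speeds).
    const pageBytes = 1.7 * 1024 * 1024;       // ~1.78 million bytes
    const bitsPerSecond = 300 * 1000;          // 300kbps
    const seconds = (pageBytes * 8) / bitsPerSecond;
    console.log(seconds.toFixed(0) + ' seconds'); // ~48s; latency pushes it toward a minute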

Mobile connectivity and bandwidth continue to improve, but they rarely jump 30% in one year. It’s ironic that developers are willing to adopt RWD techniques while making the same websites unusable on the devices they’re targeting.

I’m appalled. Admittedly, I started development in the dial-up days when 100Kb was considered excessive, but are today’s web pages seventeen times better than they were back then?

Will web page weights ever reduce? Is your site clinically obese? How did it get into that state?

Tags: efficiency, filesize, SEO, size, weight
Craig is a freelance UK web consultant who built his first page for IE2.0 in 1995. Since that time he's been advocating standards, accessibility, and best-practice HTML5 techniques. He's written more than 1,000 articles for SitePoint and you can find him @craigbuckler

  • http://uk.linkedin.com/in/karlbrownactor Karl Brown

    I don’t think it’s so much carelessness on the part of developers as time pressure, especially on large sites. Projects rarely have any time set aside to “declutter” or remove redundant code. Up against tight deadlines it’s typically easier, and cheaper, to add the new code and do nothing with the old code if you don’t have to; this would explain the increase in CSS file size, as even if the HTML that would be affected is no longer there, a developer won’t always have time to find all the related CSS and remove it. My guess would lean very much towards projects not allowing time to remove old code unless that’s the project’s main purpose; that’s not carelessness on the part of a developer but simply not having the time before delivery to do what we would all agree should be a necessary part of the job.

  • paulm

    Craig, I agree with you. I’ve been using WordPress for over six years now and have noticed the endless battle of balancing quality v performance. Due to this and an increase in visitor expectations, I have had to either build my own VPS nginx solution or use Flywheel or WPEngine for hosting.

    • Craig Buckler

      I don’t think WordPress or any other CMS is necessarily to blame; they’re just serving content. The problem seems to come from shoddy templates and bloated plug-ins which bulk up the page.

      • http://cannabissense.com/ Will Stevens

        I agree, a lot of themes/templates are full of bloat. The ease of using a CMS also lowers the bar to entry for developers who may not be able to hand-code but can troubleshoot a CMS and install a new widget or plugin (along with its additional scripts) each time they want a new function. I use Joomla, for example, and every new project begins with an install of the core CMS, followed by me disabling a third to a half of the core components and plugins that will never be used. I am then very selective about how I approach added functionality. Many CMS frameworks allow for serious functional tweaking, but just downloading a widget instead of investigating how to use the existing framework for certain tasks is the easy way out, and therefore likely to be the most common way for today’s developers to do things.

  • http://jitendravyas.com/ Jitendra Vyas

    CSS frameworks like Bootstrap are one of the reasons too.

  • David Miller

    I certainly agree with the gist of what you are saying, but I have to ask: where is “On the most efficient mobile network a 1.7Mb page will take one minute to download” coming from?

    1.7 mb in a minute? That is rather slow for even a mediocre mobile network these days, certainly not representative of the best.

    • Sai Pc

      To download 1.7 megabytes in 1 minute, you will need a ‘so-called’ 8-12 Mbps connection… AFAIK, mobile networks are definitely not faster than that… are they? I completely agree with the article; it’s just plain bloat and no one is taking steps to reduce their page sizes, or at least optimize partially for mobile phones…

      • David Miller

        An 8-12 mbps connection should be able to download 1.7mb in a matter of seconds, not a minute.

        This hardly needs debating; think about it. If the average website is 1.7 mb and takes 1 minute to load, then that’s what we should be experiencing on average. I can’t even remember a website ever taking 1 minute to load, and my mobile network is nothing special. So unless everyone else has been regularly waiting a minute for average website visits and I’ve lucked out, the number is wrong.

        • Sai Pc

          8 mbits/s => 1 megabyte/s, but on a mobile network, due to interference and signal strength, you would rarely get anything above 200-300 kbits/s, especially in crowded areas and major cities. Most people do not have anything above a 2-4 Mbps connection; in my country we don’t even have 8Mbps fixed lines, lol. But leaving all that aside, it is simply an ‘average’, which means that there are both higher and lower page weights. Maybe the sites you are visiting are optimized, or you frequent those sites and they are cached. As a fun fact, the very page we are on right now is 859.6 KB uncompressed and uncached; since some of the JS components are cached already, the net page weight for me is 376.1 KB. If a simple article page can be almost an MB, I wouldn’t doubt at all that the average page size is 1.7Mb. ;) PS: I used YSlow to check.

        • Craig Buckler

          We are considering mobile in this context. I used the same rate Sai mentioned: around 300kbps, which is 300,000 bits per second (not bytes). I rarely get anywhere near that and don’t expect to until 4G becomes a widespread, affordable reality.

          Obviously it depends which sites you’re using. Since you’re a developer, I guess you visit tech-savvy sites, which I hope weigh less than 1.7Mb and use best-practice techniques to ensure content appears quickly.

  • Jingqi Xie

    Kilobits, or kilobytes?

  • Jingqi Xie

    Averages are affected a lot by maximums: were there one webpage with a 1GB download and 1,023 blank webpages, the average would still be 1MB.

  • Anh Fam

    Are you serious now? The reason for the increase in weight despite the improvement in CSS3 compatibility is obvious: old browsers don’t go away. They are still there, and we still have to cater to the users who still use them, with hacks and polyfills and the like.

    • Craig Buckler

      I don’t agree. You don’t need hacks these days and, presuming you use progressive enhancement, you rarely need polyfills or shims either. Besides, even they would add no more than a few dozen Kb of JavaScript.

    • DF

      Making CSS3 degrade gracefully in older browsers takes a few lines of code – hardly a footprint. I wouldn’t consider that a factor.

  • Rob

    While all applications of these newer technologies are different, I often wonder if adding extra code is truly increasing performance.

  • http://twitter.com/refreshcreation Ryan Carson

    IMHO a lot of the general site bloat is directly related to the rise of WordPress, jQuery, Joomla, etc. An awful lot of UX on websites caters towards the visual styling and, as the years go on, less towards accessibility, unless it is already ingrained in the CMS.

    Page and site bloat are two major things I still take into account when designing websites, and I can happily state that my home page falls on the smaller file-size side of things, despite having a few larger images in a rotator.

    I still consider it to be on the larger side of things, though; it currently weighs in at ~500kb. But I have had some horrible designs sent to me involving 24-bit PNGs for all main site images due to customer requests, resulting in pages up to 4MB even after compression and adjustments to improve site speed rankings.

    Perhaps we need an annual competition to make a stylish webpage using the basics (HTML, CSS, JavaScript and some imagery) with a small upper limit on file size, say 50-100kb, to get smaller websites back in vogue?

  • HenriHelvetica

    There’s so much to consider. Recall when people went on diets but were still drinking large glasses of OJ that had loads of (albeit natural) sugar? The same may apply here.

    CMSs are getting heavy, that’s for sure. Some do abuse frameworks, yes: using two features of a package like Bootstrap yet installing the whole thing when going live. I’ve been looking at page weights of late, and I’ll see a bunch of image files that, shockingly, have not been optimized.

    But there will always be a page weight vs site design battle. We’re constantly given tools to do more, yet more than ever we’re being reminded of page weight. We just need to find a good and sensible middle ground.

  • http://tobto.org/ seo freelancer

    More ‘quality content’ means more hi-res images, more kilobytes, retina-ready pics, etc. In fact, there’s nothing bad about that if you impress your customers with top-notch original captures of your activity. It’s only a big issue if your audience sleeps with a responsive Teddy Bear.

  • Craig Buckler

    So companies cannot afford conscientious development but are happy to lose up to one in five customers? Isn’t it up to developers to educate clients? They may not listen or care but, if they want a badly-built site, they can hire a bad developer.

    • http://uk.linkedin.com/in/karlbrownactor Karl Brown

      To an extent, but particularly in large companies where the website is only part of the business, it doesn’t always work that way. Typically the “client” will give a list of projects to their business partner, whose sales people/project managers will say it can be done in x months. It’s far too common for the developers to be brought in at the low-level design phase, which in my experience is far later than it should be for feedback from the developers to affect timelines. I’ve also found, especially in large businesses, that multiple projects are ongoing at any one time, so the developers will either 1) be working in silos, or 2) be spread thinly across the multiple projects so their attention is never fully on project A, X or Z.

      Also, unless the project is specifically aimed at reworking the code structure of the site, more often than not developers are working with legacy code, being told “No, you cannot change the HTML” (as an example) even if that code has been there for 5 years and is woefully outdated. Often they’re told no because the costs for the project have been agreed and it would take an additional couple of days to bring some parts of the code up to modern standards. Taking CSS as an example, I’ve seen it time and time again where it’s easier to simply tack the code you need onto the end of the existing style sheet rather than trawl through tens of thousands of lines that have accumulated over the years. Add to that what I already mentioned about multiple projects being developed simultaneously, and the teams could very well be writing similar code; but because the developers have been silo’d by their bosses, they haven’t had the chance to see all the projects running in a test environment, so by the time they can see where savings could be made, it’s too close to the go-live date and there’s no chance left to optimize and reduce the code size.

      Also on the CSS front, particularly with large websites, the preferred way of working is to get the look and feel the way some key stakeholders want it, in the cheapest way possible. In one case I know of, that was achieved by using a piece of existing functionality in a different way; it gave the look and feel but bloated the page (which I remember telling them it would). Look and feel and quick turnarounds won the day over page weight and performance.

      I’m not saying that the way it works is the best way to do it, but the realities on the ground mean that, far from being “bad” or “careless”, developers are being pushed into working on projects that should have longer time frames, are working with systems they’ve inherited and had to reverse-engineer to get anywhere, and are working on sites that are so bloated (and that clients refuse to spend the money to bring up to modern standards and improve performance) that the only thing they can ever do is what’s paid for.

  • DF

    In response to #3 (web devs are lazy): what you failed to take into account is that most of us freelancers are working under budget and time constraints. It takes time to trim the fat, and the majority of clients will not want to pony up for it. I’d have a really hard time convincing a client to spend another 10-15% to tighten up the code and shave some seconds off page loading time when he’s already happy with the end result; and it was a hard enough sell to get their budget to cover the “sloppy but invisible to the client” coded website in the first place.

    Now if I’m working full time for a company and the company needs to keep me busy with a constantly improving website, then the work makes more sense.

    In the real world of freelancers, you gotta make the client happy within budget (the budget itself is a strong factor in that client’s happiness) then move on to the next project.

    I’m not trying to completely discount the value of trim code, but I think it’s one of the last things on most clients’ minds. And the customer is always right :-)

  • George

    Been saying this all along. Lazy programmers make for bad code.

  • Francisco Alvarez

    In my personal experience, the biggest problem big sites have is third-party tools. All leading websites are using more and more analytics, serving more ads, and sharing content with a million different social networks. While you can write good, performant code, you’ll always have requirements that kill your page load.
    It is a total shame, but it’s also what makes $$ on the web.

  • jrista

    Hmm, very interesting stuff. I am not surprised by the increase in JavaScript size. Scripting has become a much more important aspect of web development, as it is far more effective than it was years ago. We are developing websites to be more like rich apps thanks to HTML5 and CSS3, so an increase in functional code is expected.

    I guess I am not all that surprised that the total image size has increased as well… visuals are still very important, and when it comes to video and/or photography, larger image sizes have become a very big trend. Just look at 1x.com, 500px.com, and Flickr.com: all three sites have capitalized on “HD” designs, and the recommended image upload sizes have jumped from around 750-900 pixels wide to 1920 pixels wide. If the study involved any of those three photography sites, that could very well account for a 30% increase in average image size. Many new website designs, even flat ones, often use very large images in the upper segment of the landing page (or even each page), and many of those designs use some kind of slider or rotator to cycle through content (each tab of which has a different high-resolution image).

    I am surprised by the small increase in HTML. Does that only account for the initial HTML page download, or does it also include asynchronously downloaded content (i.e. AJAX)? I know that these days, I pull down partial views more frequently than I used to. While my initial page download might be 50k, I’m sure I bring down another 15-30k in partial views to fill in regions of my pages that have dynamic content.

    As for CSS size, I am not really surprised by that either. Is it really true that there is less need for vendor prefixes? I still seem to NEED to throw in -webkit-, -moz-, -o-, and -ms- despite very much not wanting to. When it comes to animations, vendor prefixes explode, and my CSS ends up with four to five times as many lines for CSS animations as should really be necessary. Yet, one way or another, either I or a customer finds a reason to keep them in… because they want to make sure that one particular browser for that one particularly important customer is supported.

    I don’t foresee the need for vendor prefixes going away enough that we can reduce the size of our CSS files for a few more years yet at least…sad as that realization is.

  • DF

    I think #1 (bloated templates) and #3 (developer laziness, or lack of knowledge) should be one and the same, and probably account for 90%+ of cases. I see it all the time: graphic designers promoting themselves as developers who really only know how to take a WordPress template and modify a few graphics here and there. The template is still referencing all of the scripts that aren’t even being used.

  • DF

    I agree with this 100%. With most clients I deal with, I’d have a hard time selling a 20% increase in budget for me to go through and clean and tighten up the code. Most are only concerned with what they can see: shaving a second or two off page load is not something they are going to notice or want to pay extra for. Now if you can convince them that they will rank higher in the search engines and turn more profit, then you may have a chance.

    • Jay76

      I work in web analytics, and it is (sometimes) possible to show an increase in conversions when pages load faster. Even a half second decrease can have a noticeable impact if the audience is large enough.

      Of course, you need to convince the client to take that initial investment before you can measure it, but it does have some impact.
