Average Page Weights Increase by 32% in 2013


The HTTP Archive Report has published its end-of-year technology statistics, which collate information from 300,000 of the web’s most popular websites. The average page has bloated by 32% in one year to reach more than 1,700Kb (1.7Mb) and now comprises 96 individual HTTP requests. That’s worse than the already staggering 30% increase in 2012!

Some of the increase can be explained by increased ecommerce activity and advertising as people hunt for gifts. However, few web sites lose weight in January and continue to gorge themselves throughout the year.

The report analyzes publicly accessible content and shopping websites rather than complex web applications. It provides a breakdown of the specific technologies used:

technology    end 2012    end 2013    increase
HTML              54Kb        57Kb         +6%
CSS               35Kb        46Kb        +31%
JavaScript       211Kb       276Kb        +31%
Images           793Kb     1,030Kb        +30%
Flash             92Kb        87Kb         -5%
Other            101Kb       205Kb       +103%
Total          1,286Kb     1,701Kb        +32%

The rise in HTML is fairly negligible, although it’s slightly surprising given the trend toward cut-down content and simpler, flatter designs. 57Kb is quite chunky for content alone.

CSS sizes have increased by 11Kb on average. Some of that can be explained by Responsive Web Design and CSS3 effects, though the reduced need for vendor prefixes should have helped.

Any rise in HTML and CSS could have been offset by a decrease in JavaScript code; there’s less reason to use large script libraries now that we have better browser consistency and CSS3 animations. That hasn’t happened, and the average page now loads 18 individual script files. Concatenation and minification would help immensely.

Unsurprisingly, Flash has dropped by a few kilobytes and the proportion of pages using the plugin has fallen from 37% to 32%. Advertisers remain the primary users, but HTML5 alternatives are starting to appear now that Responsive Web Design is a mainstream technique.

“Other” files have doubled in size. Almost a third of this growth can be attributed to webfonts and webfont icon sets which is acceptable given that it should lead to a reduction in image use … except it hasn’t. Perhaps high-density photographs can justify some increase, but who is loading a megabyte of images on every page?

The figures are more shocking when you consider they’re averages. Approximately half the websites analyzed will be even more obese. We web developers should hang our heads in shame.

The Reasons

What can we blame? My primary suspects are:

  1. Bloated CMS Templates
Typical WordPress themes are crammed full of features. Many are third-party styles and widgets the author has added to make the theme more useful or attractive to buyers. Many of those features will never be used, but the files are still present.
  2. HTML5 Boilerplates
    A boilerplate may save time but it’s important to understand they are generic templates. The styles and scripts contain features you’ll never use and the HTML can be verbose with deeply-nested elements and long-winded, descriptive class names. Few developers bother to remove redundant code.
  3. Carelessness
    Developers are inherently lazy; we write software to make tasks easier. However, if you’re not concerned about the consequences of page weight, you should have your web license revoked.

Even if we set aside SEO, software efficiency and user responsiveness, one in five web visits now comes from a phone. On a typical real-world mobile connection, a 1.7Mb page can take a minute to download, assuming the phone or tablet can render it effectively at all. Would a potential customer be prepared to wait?
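The minute figure is simple arithmetic. A sketch of the calculation, assuming roughly 230kbit/s of effective mobile throughput (the rate is an assumption chosen to match the estimate above; real-world speeds vary widely):

```javascript
// Rough page download time. sizeKB is page weight in kilobytes;
// linkKbps is effective link speed in kilobits per second. The 230kbps
// throughput figure used below is an assumption, not a benchmark.
function downloadSeconds(sizeKB, linkKbps) {
  return (sizeKB * 8) / linkKbps; // kilobytes -> kilobits, then divide
}

// downloadSeconds(1700, 230) -> ~59 seconds, i.e. about one minute
```

Note the bits/bytes distinction: a nominal “8Mbps” connection moves at most 1 megabyte per second, and congested mobile links rarely reach their headline rate.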

Mobile connectivity and bandwidth continue to improve, but they rarely jump 30% in one year. It’s ironic that developers are willing to adopt RWD techniques while making the same website unusable on the devices they’re targeting.

I’m appalled. Admittedly, I started development in the dial-up days when 100Kb was considered excessive, but are today’s web pages seventeen times better than they were back then?

Will web page weights ever reduce? Is your site clinically obese? How did it get into that state?


  • paulm

    Craig. Agree with you. I’ve been using wp for over 6 years now and have noticed the endless battle of balancing quality v performance. Due to this and increase in visitor expectation I have had to either build my own vps nginx solution or use flywheel or wpengine for hosting.

    • Craig Buckler

      I don’t think WordPress or any other CMS are necessarily to blame — they’re just serving content. The problem seems to come from shoddy templates and bloated plug-ins which bulk up the page.

      • http://cannabissense.com/ Will Stevens

        I agree, a lot of themes/templates are full of bloat, and the ease of using a CMS lowers the bar to entry for developers who may not be able to hand-code but can troubleshoot a CMS and install a new widget or plugin (along with its additional scripts) each time they want a new function. I use Joomla, for example, and every new project begins with an install of the core CMS, followed by me disabling a third to a half of the core components and plugins that will never be used. I am then very selective about how I approach added functionality. Many CMS frameworks allow for serious functional tweaking, but downloading a widget instead of investigating how to use the existing framework for a task is the easy way out, and therefore likely the most common way for today’s developers to do things.

  • http://jitendravyas.com/ Jitendra Vyas

    One of the reasons is CSS frameworks too, like Bootstrap.

  • David Miller

    I certainly agree with the gist of what you are saying, but I have to ask: where is this “On the most efficient mobile network a 1.7Mb page will take one minute to download” coming from?

    1.7 mb in a minute? That is rather slow for even a mediocre mobile network these days, certainly not representative of the best.

    • Sai Pc

      To download 1.7 megabytes in 1 minute, you will need a ‘so-called’ 8-12 Mbps connection… afaik, mobile networks are definitely not faster than that… are they? I completely agree with the article; it’s just plain bloat, and no one is taking steps to reduce their page sizes, or at least optimize partially for mobile phones…

      • David Miller

        An 8-12 mbps connection should be able to download 1.7mb in a matter of seconds, not a minute.

        This hardly needs debating, think about it. If the average website is 1.7MB and takes 1 minute to load, then that’s what we should be experiencing on average. I can’t even remember a website ever taking 1 minute to load, and my mobile network is nothing special. So unless everyone else has been regularly waiting a minute for average website visits and I’ve lucked out, the number is wrong.

        • Sai Pc

          8 mbits/s => 1 megabyte/s. On a mobile network, due to interference and signal strength, you would rarely get anything above 200-300 kbits/s… especially in crowded areas and major cities. Most people do not have anything above a 2-4 Mbps connection; in my country we don’t even have 8Mbps fixed lines lol. But leaving aside all that, it is simply an ‘average’, which means that there are both higher and lower page weights… maybe the sites you are visiting are optimized, or you frequent those sites and they are cached… as a fun fact, the very page we are on right now is 859.6 KB uncompressed and uncached… since some of the js components are cached already, the net page weight for me is 376.1 KB… if a simple article page can be almost an MB, I wouldn’t doubt at all that the avg. page size is 1.7… ;) PS: I used YSlow to check…

        • Craig Buckler

          We are considering mobile in this context. I used the same rate Sai mentioned — around 300kbps, which is 300,000 bits per second (not bytes). I rarely get anywhere near that and don’t expect to until 4G becomes a widespread, affordable reality.

          Obviously it depends which sites you’re using. Since you’re a developer, I guess it’s tech-savvy sites which I hope were lower than 1.7Mb and using best-practice techniques to ensure content appears quickly.

  • Jingqi Xie

    Kilobits, or kilobytes?

  • Jingqi Xie

    Averages are skewed heavily by outliers. If there were one webpage with a 1GB download and 1,023 blank webpages, the average would still be 1MB.

  • Anh Fam

    Are you serious now? The reason for the increase in weight despite the improvement in CSS3 compatibility is obvious: old browsers don’t go away. They are still there, and we still have to cater to the users who use them — with hacks and polyfills and the like.

    • Craig Buckler

      I don’t agree. You don’t need hacks these days and, presuming you use progressive enhancement, you rarely need polyfills or shims either. Besides, even they would add no more than a few dozen Kb of JavaScript.

    • DF

      Making CSS3 degrade gracefully in older browsers takes a few lines of code — hardly a footprint. I wouldn’t consider that a factor.

  • Rob

    While all applications of these newer technologies are different, I often wonder if adding extra code is truly increasing performance.

  • http://twitter.com/refreshcreation Ryan Carson

    IMHO A lot of the general site bloat is directly related to the rise of wordpress, jquery, joomla, etc. An awful lot of UX on websites caters towards the visual styling and, as years go on, less about accessibility, unless it is already ingrained in the CMS.

    Page and site bloat are two major things I take into account when designing websites still and I can happily state that my home page falls into the smaller file-size side of things, despite having a few larger images in a rotator.

    I still consider it to be on the larger side of things though; it currently weighs in at ~500kb. But I have had some horrible designs sent to me involving 24-bit PNGs for all main site images due to customer requests, resulting in pages up to 4MB even after compression and adjustments to improve site speed rankings.

    Perhaps we need an annual competition to make a stylish webpage using the basics — HTML, CSS, JavaScript and some imagery — with a small upper limit on file size, say 50-100kb, to get smaller websites back in vogue?

  • HenriHelvetica

    There’s so much to consider. Recall when ppl went on diets but were still drinking large glasses of OJ that had loads of (albeit natural) sugar? The same may apply.

    CMSs are getting heavy – that’s for sure. Some do abuse frameworks – yes, using 2 features, and installing the full package when going live (like Bootstrap). I’ve been looking at weights of late, and I’ll see a bunch of img files that have not been optimized – shockingly.

    But there will always be a page weight vs site design battle. We’re constantly given tools to do more, but more than ever are being reminded of pw. Just need to find a good and sensible middle ground.

  • http://tobto.org/ seo freelancer

    More ‘quality content’ means more hi-res images, more kilobytes, retina-ready pics, etc. In fact, there’s nothing bad about impressing your customers with top-notch original captures of your activity. It’s only a big issue if your audience sleeps with a responsive Teddy Bear.

  • Craig Buckler

    So companies cannot afford conscientious development but are happy to lose up to one in five customers? Isn’t it up to developers to educate clients? They may not listen or care but, if they want a badly-built site, they can hire a bad developer.

  • DF

    In response to #3 (web devs are lazy): what you failed to take into account is that most of us freelancers are working under budget and time constraints. It takes time to trim the fat, and the majority of clients will not want to pony up for it. I’d have a really hard time convincing a client to spend another 10-15% to tighten up the code and shave some seconds off page loading time when he’s already happy with the end result — and it was a hard enough sell to get their budget to cover the “sloppy but invisible to the client” coded website.

    Now if I’m working full time for a company and the company needs to keep me busy with a constantly improving website, then the work makes more sense.

    In the real world of freelancing, you’ve got to make the client happy within budget (the budget itself is a strong factor in that client’s happiness), then move on to the next.

    I’m not trying to completely discount the value of trim code. But I think it’s one of the last things on most client’s minds. And the customer is always right :-)

  • George

    Been saying this all along. Lazy programmers make for bad code.

  • Francisco Alvarez

    In my personal experience the problem most of big sites have is third party tools. All leading websites are using more and more analytics, printing more ads, and sharing content with a million different social networks. While you can develop good performant code, you’ll always have requirements that kill your page load.
    It is a total shame, but it’s also what makes $$ on the web.

  • DF

    I think #1 (bloated templates ) and #3 (developer laziness – or lack of knowledge) should be one and the same, and probably account for 90%+ of the cases. I see it all the time – graphic designers promoting themselves as developers, who really only know how to take a WordPress template and modify a few graphics here and there. The template is still referencing all of the scripts that aren’t even being used.

  • DF

    I agree with this 100%. With most clients I deal with, I’d have a hard time selling a 20% increase in budget for me to go through and tighten up the code. Most are only concerned with what they can see. Shaving a second or two off page load is not something they are going to notice or want to pay extra for. Now, if you can convince them that they will rank higher in the search engines and turn more profit, then you may have a chance.

    • Jay76

      I work in web analytics, and it is (sometimes) possible to show an increase in conversions when pages load faster. Even a half second decrease can have a noticeable impact if the audience is large enough.

      Of course, you need to convince the client to take that initial investment before you can measure it, but it does have some impact.