Average Page Weight Increased Another 16% in 2015

By Craig Buckler

It’s happened again. The HTTP Archive Report, which collates technical information from half a million of the web’s most popular websites, reports that average page weight increased 16% during 2015 to reach 2,262KB. A similar increase was observed in 2014.

The report analyzes publicly-accessible content and shopping websites. It cannot analyze web applications or pages behind a login, such as Facebook.

technology     end 2014 (KB)   end 2015 (KB)   increase (%)
HTML                      59              66           12%
CSS                       57              76           33%
JavaScript               295             363           23%
Images                 1,243           1,443           16%
Flash                     76              53          -30%
Other                    223             261           17%
Total                  1,953           2,262           16%
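The arithmetic holds up: the per-technology figures sum to the yearly totals, and the overall increase rounds to 16%. A quick sanity check in Python, using the figures from the table above:

```python
# Per-technology page weights in KB, copied from the table above.
end_2014 = {"HTML": 59, "CSS": 57, "JavaScript": 295,
            "Images": 1243, "Flash": 76, "Other": 223}
end_2015 = {"HTML": 66, "CSS": 76, "JavaScript": 363,
            "Images": 1443, "Flash": 53, "Other": 261}

total_2014 = sum(end_2014.values())   # 1,953KB
total_2015 = sum(end_2015.values())   # 2,262KB
increase = round((total_2015 - total_2014) / total_2014 * 100)
print(total_2014, total_2015, increase)  # 1953 2262 16
```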

Figures are averages. Page weights are unlikely to follow a normal distribution but the numbers seem reasonable when you dissect pages around the web.

HTML content has risen by 7KB. Is the average article 12% longer? I doubt it; the growth is more likely extra markup than extra content.

Unsurprisingly, Flash has dropped by 23KB, or 30%. One in five sites continues to use Flash — a drop of 26% over the year.

CSS increased by 19KB. There’s little excuse for the average of eight stylesheets per page, but the overall file size hike seems reasonable given:

  1. CSS capabilities increase over time. We’re using effects, animations and responsive layouts which were not possible a few years ago (whether they’re necessary is another matter).
  2. Pre-processors such as Sass, LESS and Stylus have a tendency to bulk up code, because nesting and property reuse are easy.
  3. Build tools make it easier to inline image assets.

Features such as CSS Flexbox can help replace more complex float-based layouts, but the savings are fairly minimal.

You might have expected JavaScript code to drop accordingly, but it’s grown by 68KB to reach 363KB, distributed over 22 separate files. That’s a lot of code. Some will be frameworks and libraries, but I suspect the majority is social media widgets and advertising.

Other files, such as fonts and videos, have increased by 38KB or 17%.

As usual, the biggest rise is from images. We’re loading an additional 200KB, which accounts for 65% of the overall growth. 55 separate image files are accessed per page, which seems excessive.

Other highlights (or lowlights):

  • 25% of sites do not use GZIP compression
  • 101 HTTP file requests are made — up from 95 a year ago
  • pages contain 896 DOM elements — up from 862
  • resources are loaded from 18 domains
  • 49% of assets are cacheable
  • 52% of pages use Google libraries such as Analytics
  • 24% of pages now use HTTPS
  • 36% of pages have assets with 4xx or 5xx HTTP errors
  • 79% of pages use redirects
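That first statistic deserves emphasis: a quarter of sites serve text assets uncompressed, yet GZIP routinely shrinks HTML, CSS and JavaScript to a fraction of their raw size. A minimal sketch with Python’s standard gzip module (the markup string is an invented stand-in for a typical page):

```python
import gzip

# An invented, repetitive chunk of markup standing in for a typical HTML page.
html = ("<div class='post'><h2>Title</h2><p>Body text...</p></div>\n" * 200).encode()

compressed = gzip.compress(html)
print(f"{len(html)} bytes -> {len(compressed)} bytes "
      f"({len(compressed) / len(html):.0%} of the original)")
```

Real pages are less repetitive than this toy example, but large savings on text assets are typical, which is why skipping compression is hard to excuse.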

Why Have Pages Bloated?

There’s a simple explanation for 2.2MB pages: we’re doing a terrible job.

As a developer, I love the web. As a user, it’s often awful. Sites are desperate to “increase engagement” with intrusive advertising, annoying pop-ups, under-used social media cruft and invasive tracking. Perhaps this leads to momentary revenue gains, but the increased bloat is counter-productive:

  • Google downgrades heavyweight sites, which can harm search engine optimization efforts.
  • Advertising claims to keep content free. Will users consider it free when they discover it costs $0.40 to view one page on a typical mobile data plan?
  • The elevation of ad-blockers to mainstream consciousness during the year highlights user frustrations and the ease with which anyone can block irritating content.
  • Users do not wait. Etsy.com discovered that 160KB of additional images caused their bounce rate to increase 12% on mobile devices.
  • Web activities are starting to attract government attention. For example, UK mobile operators can be fined if a service using their network makes gains from misleading campaigns. Regulation will inevitably escalate as sites become more desperate.

Content is being drowned in cruft. Uncompressed, Shakespeare’s 37 plays total more than 800,000 words, or a 5MB download. Now consider Facebook’s Instant Articles overview, which suggests an alternative to bloated pages. It contains five paragraphs yet, ironically, requires 3.5MB of bandwidth, plus another 50MB when you view the three-minute video. Does it convey more information than the combined works of Shakespeare? Perhaps it’s prettier, but is it necessary to show a 267KB image of a chap holding an invisible ferret?

Obesity Denial

I’ve published several of these articles. Many claim there is no obesity crisis, so let’s tackle the primary arguments.

Some pages will always be big
Image galleries, games, technical showcases etc. will always be chunky. Yet every developer can justify the weight of their pages. The HTTP Archive Report mostly analyzes content articles and online shops. 2.2MB per page — the equivalent of half Shakespeare’s plays — is ridiculous.

Page weight is not an indication of quality
In my experience, page weight is inversely proportional to quality. Bulky pages are often link-bait articles or meaningless marketing extravaganzas. There are exceptions, but wouldn’t it be great if there were a tool to rate content and compare that against bloat?

Users never complain about overweight pages
…because they abandon them. Analytics record those who successfully access a site. They won’t highlight those who couldn’t load obese pages or never returned again.

Bandwidth capacity increases every year — page weight is negated
Did your bandwidth on all networks increase by 16% in 2015? And 15% in 2014? And 32% in 2013? And 30% in 2012?

Even if it did, it doesn’t follow everyone had the same experience. Smartphone access is exploding, yet mobile networks remain slow in many places around the world.

Even if we presume connectivity is excellent everywhere, do developers have a duty to use the increased capacity? Have pages become 16% better than last year?

Slimming pages means dumbing down, with fewer features and effects
Why? In some cases sites can make incredible savings with a few minutes’ effort. Activating compression and concatenating assets won’t change anything except performance.
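To illustrate the concatenation point: merging stylesheets not only removes HTTP requests, it also lets GZIP exploit redundancy across files, so the combined file compresses better than the parts did separately. A sketch using invented CSS fragments:

```python
import gzip

# Three invented stylesheet fragments with overlapping declarations.
sheets = [
    b".btn  { padding: 8px 16px; border-radius: 4px; color: #fff; }\n" * 40,
    b".card { padding: 8px 16px; border-radius: 4px; margin: 1em; }\n" * 40,
    b".nav  { padding: 8px 16px; border-radius: 4px; display: flex; }\n" * 40,
]

separate = sum(len(gzip.compress(s)) for s in sheets)  # three files, three requests
combined = len(gzip.compress(b"".join(sheets)))        # one file, one request
print(separate, combined)  # combined is smaller, as well as fewer requests
```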

An obsession with performance leads to more complication and maintenance
Removing unused or unnecessary images, assets, features and widgets simplifies a site. It’ll lead to fewer complications and less maintenance. No one is suggesting you should fret over every byte, but look after the kilobytes and the megabytes will look after themselves.

Additional page weight is the price of progress
In some edge cases, perhaps. Lightsaber Escape has a 120MB payload because it’s a revolutionary and experimental browser-based game (which is not analyzed by the HTTP Archive). Can the same be said for Apple’s 11.2MB iPhone 6S page, which delivers a few paragraphs and effects which (mostly) could have been achieved a decade ago in a fraction of the size?

SitePoint’s pages often exceed 2MB
Ahem. Oh look — a squirrel…

SitePoint pages can be heavy on desktop devices, although they reduce to a few hundred kilobytes when viewed on a small screen over a slower network. No site is perfect, and performance efforts are being made, but I agree it could be improved.

Why should I bother? Few others do
The reason: there’s no downside. Your users benefit from more responsive pages and a slicker experience. Your search engine ranking improves. Your conversions increase. Your hosting charges drop. You’re even reducing electricity consumption and doing your bit to save the planet. It’s a little extra effort, but the most drastic optimizations result from the simplest changes — see The Complete Guide to Reducing Page Weight.

Unconscious Obesity

Developers and site owners unconsciously created the obesity epidemic, but considerable savings can be made by addressing:

  1. Images and videos. Is a hero necessary? Can the file be resized or reduced? Could it be replaced with CSS effects? Are content managers uploading monolithic files? Will any user see all 55 images?
  2. Third-party content. Advertising and social media scripts can be ludicrously wasteful. Are they necessary? Are there better options?
  3. Attitude. Performance rarely concerns those sitting on a 100Mbps connection. Try throttling bandwidth, connecting via 3G or using hotel Wi-Fi for a while — it may change perceptions.
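One practical way to keep that attitude honest is a page-weight budget enforced during the build. The asset names and the budget below are invented for illustration; this is a sketch of the idea, not any particular tool:

```python
# Invented per-asset sizes in KB, as a build step might collect them.
assets = {
    "hero.jpg": 420,
    "app.js": 310,
    "styles.css": 75,
    "analytics.js": 55,
    "social-widget.js": 90,
}

BUDGET_KB = 750  # an arbitrary target for this sketch

total = sum(assets.values())
over = total - BUDGET_KB
if over > 0:
    # Report the heaviest offenders first so they can be trimmed or dropped.
    for name, size in sorted(assets.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {size}KB")
    print(f"Over budget by {over}KB ({total}KB vs {BUDGET_KB}KB)")
```

Build tools offer similar checks (webpack’s performance hints, for instance); the point is to catch weight gain when it happens, not after the page is already obese.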

It’s far easier to gain weight than lose it. Adding a few extra widgets or graphics seems reasonable, but it soon mounts up. Going back and removing superfluous junk can be tedious and difficult. The answer seems obvious: don’t add it in the first place.

Few site owners care about bloat. Performance is a developer mind-set. Consider it from the start of the project and it will not significantly increase development time. Addressing performance after your pages hit obese levels is too late.

  • James Ferrell

    Might be a silly question, but do the image statistics take responsive images into account?

    • Nicolae Crefelean

      That’s a very good question, actually. It boils down to what the users get, not what the content owners serve the client. And it’s not just about dynamic content on the websites but also external content out of anyone’s control. So the actual load times and sizes of the web pages can easily ramp up significantly when the ads kick in.

    • Craig Buckler

      I would expect so. The only effective way to collate these statistics is to use a browser which renders the page and executes JavaScript. PhantomJS or similar is likely to be used.

      The site therefore loads as a desktop browser would see it. It’s possible the system changes the resolution to emulate a mobile device but we can presume the analysis fetches the best heavyweight image.

      That said, I doubt responsive images are a significant cause of growth. They’re primarily used for a few assets and the technology is fairly new so relatively few sites use them.

      In addition, responsive images only adapt to resolution or DPI. A Retina-screened iPhone on 3G can request more data than a low-end laptop on wifi.

  • Tatsh

    Most of the time I do not disagree, but there is such a thing as a ‘get work done’ site vs a random click-bait site with slideshows and lots of ads.

    The ‘get work done’ site could be your bank, or something for your actual job. Regardless of how terrible that platform could be in terms of page weight (and most likely UI), you still have to use it.

    The big question is: how do you convince the big companies to stop not caring about the waste? That goes for both categories of sites.

    I know from working in ad tech that the ‘waste’ is inevitable when you let your advertisers walk all over you (such as demanding crazy complex slideshows) because you are afraid to lose their business.

  • M S i N Lund

    Frameworks + extra-big-ass-background-images/videos.

    Everyone is doing, me want do too!!
    Me no have own self!

  • Page weight increases again ;_;

    Business and time pressures can come into it, as we tend to build layer upon layer during a site’s life cycle until we get to the major redesigns. I’ve tried to instill performance cultures and smaller code sizes in a couple of places I’ve worked, and the push-back seems to be against refactoring the existing code.

    When it comes to redesigns I’ve had better headway, though I tend to have to argue against bringing across an entire framework (“Yes, your chosen framework will speed up the development cycle at the start, but it’s also 3 times bigger for CSS than we need for the whole site, so why don’t we use only the bits that we absolutely need?”). We need to have the conversations with stakeholders that slow websites mean fewer visitors, which means fewer conversions, which means fewer sales, while quicker websites mean more overall profit (putting things in terms of money seems to go down well).

    @Craig Buckler:disqus, do you think there’s merit in a “CSS Zen”-style site that’s a collection of what can be done with minimal page weight? I’ve been toying with the idea of building pages for fun that are less than 50KB all told, and Maciej Ceglowski’s talk on the subject mentioned keeping pages smaller than a piece of Russian literature, so it might be worth setting up a community that shows just what can be done with minimal file sizes.

    • Craig Buckler

      Unfortunately, stakeholders often think that adding “just one more feature” will magically improve their site. Perhaps we should suggest that a feature is removed for every new one they add? Performance is rarely considered but I agree that phrasing issues in monetary terms will be more effective.

      I like the idea of a Zen Garden for page efficiency. Another option would be to take an existing page (like the iPhone page) and show how performance can be improved while keeping – or even improving – the existing layout and effects.

  • Max Beggelman

    Not all page weight is created equal, either. Images are by far the biggest byte-gobblers, but they’ve always been the first targets for middleman optimization. Whether it’s using a proxy to fetch and optimize images before delivering them to the browser (a strategy that dates back to before smartphones, and is still in use by at least one mobile browser today) or just hiding images and media altogether until the user specifically requests them (an option in some browsers), images have always been the easy fat to trim for browser makers or networks looking to take data usage into their own hands.

    Javascript, on the other hand, is much harder for users and browser makers to seek savings in on their own. While images are distinct elements within a page which can easily be separated out, JS is integral in a way which makes it much harder to selectively block or allow the user to selectively whitelist without impacting functionality. There are browser extensions that give that sort of control, but they’re hard to use, and fundamentally just can’t be as easy and intuitive as “click on this blank container if you want to load this image” is.

    CSS is even harder to manage, and its growth is inevitable as long as web developers insist on sticking to the bleeding edge. Not only are we using more features, but vendor prefixing means each use takes up three to five times as much data as it should if we want that fresh new feature to work in every browser. Flexbox, due to its convoluted history, requires a particularly heavy load of browser-specific CSS, and that adds up fast. Is replacing every float property with five display properties (-webkit-box, -moz-box, -ms-flexbox, -webkit-flex, and flex) really saving any bytes at all?

    As for the other main course of action for cutting down JS data usage, optimizing through minification and shedding unnecessary bytes, the disturbing part is that most of this hefty code is ALREADY minified. Of the 8 JavaScript files this page loads that are more than 100KB, each and every one of them has already been minified. All these big heavy frameworks, libraries, and advertising scriptpacks – they’re already minified. There’s no more room to optimize and trim. We need to look at trimming out actual functions, libraries, and frameworks. And if we can’t cut out advertising and fluff, for business reasons, then we need to look at cutting out the convenience tools we use to make our job easier.

    Advertising and other crufty scripts certainly pile up, but the biggest JS file on this page – totaling over 500KB all by itself – looks to be coming from a JavaScript framework. The web community has been moving for a long time toward incorporating frameworks and libraries to make development easier, but that comes down to making pages bigger and slower in order to buy ourselves some convenience and make complicated tricks easier, and maybe it’s time to start going the other way. It’s not a pleasant solution, but optimization has its limits: at some point sacrifices do need to be made to get page sizes down, and if we can’t get anyone else to make those sacrifices, then there’s no choice but to ask ourselves how much we’re willing to sacrifice to get that page weight down.

  • Craig Buckler

    This says it all…

    “If present trends continue, there is the real chance that articles
    warning about page bloat could exceed 5 megabytes in size by 2020.”

  • VillageElder

    Less is more. All that additional code shows up on the screen as additional page elements. Busy readers (your most important readers?) tend to skip over web pages that are too cluttered with sidebars, ads, images, pop-ups, et al. They don’t have the time and the patience to skim the page and separate the wheat from the chaff (information from the fluff) before deciding what to spend their valuable time on reading/viewing. They may be quickly going through their inbox before a meeting and disregarding, without reading, anything that is too time consuming to digest. So what has providing all that additional code gotten you? Remember: Your reader’s attention is your objective.

  • Craig Buckler

    Whaaat? I’ve just checked and got 1.7MB with ads (which is still bad). Could you identify which advert caused that? A video perhaps?

  • strangersound

    Page weight is an issue, but hardly the real problem. An average page weight of 2.2MB I can handle, but most regular pages translate into a 70MB footprint in your browser. I don’t know enough to know why, but I don’t get it. I’m failing to understand what is going on in the background that can consume so many resources. I can scroll Facebook with just a few wheel scrolls and hit 350MB.