Average Page Weight Increased Another 16% in 2015

By Craig Buckler
It’s happened again. The HTTP Archive Report, which collates technical information from half a million of the web’s most popular websites, reports that average page weight increased 16% during 2015 to reach 2,262KB. A similar increase was observed in 2014.
The report analyzes publicly accessible content and shopping websites. It cannot analyze web applications or pages behind a login, such as Facebook.
| technology | end 2014 (KB) | end 2015 (KB) | increase (%) |
|------------|---------------|---------------|--------------|
Figures are averages. Page weights are unlikely to follow a normal distribution, but the numbers seem reasonable when you dissect pages around the web.
HTML content has risen by 7KB. Is the average article 12% longer? I doubt it.
Unsurprisingly, Flash has also dropped by 23KB or 30%. One in five sites continue to use Flash — a drop of 26% over the year.
CSS increased by 19KB. There’s little excuse for using eight stylesheets but the overall file size hike seems reasonable given:
- CSS capabilities increase over time. We’re using effects, animations and responsive layouts which were not possible a few years ago (whether they’re necessary is another matter).
- Pre-processors such as Sass, LESS and Stylus have a tendency to bulk up code because nesting and property reuse are easy.
- Build tools make it easier to inline image assets.
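That last point carries a hidden cost: inlined assets pay a Base64 overhead. A rough Python sketch (the image bytes here are a stand-in, not a real PNG):

```python
import base64

# Stand-in for a small image's bytes (1KB); not a real PNG.
image_bytes = bytes(range(256)) * 4

# Build tools inline assets as data URIs roughly like this:
data_uri = "data:image/png;base64," + base64.b64encode(image_bytes).decode("ascii")

# Base64 encodes every 3 bytes as 4 characters, so the inlined asset
# is about a third larger than the original file.
print(len(image_bytes), len(data_uri))
```

Inlining still wins for tiny images (it saves a whole HTTP request), but it partly explains why stylesheets keep growing.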
Features such as CSS Flexbox can help reduce more complex float-based layouts but the savings are fairly minimal.
Other files, such as fonts and videos, have increased by 38KB or 17%.
As usual, the biggest rise is from images. We’re loading an additional 200KB, which accounts for 65% of the overall growth. 55 separate image files are accessed per page, which seems excessive.
- 25% of sites do not use GZIP compression
- 101 HTTP file requests are made — up from 95 a year ago
- pages contain 896 DOM elements — up from 862
- resources are loaded from 18 domains
- 49% of assets are cacheable
- 52% of pages use Google libraries such as Analytics
- 24% of pages now use HTTPS
- 36% of pages have assets with 4xx or 5xx HTTP errors
- 79% of pages use redirects
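The first statistic is the most striking, because GZIP is nearly free bandwidth: text assets such as HTML and CSS are highly repetitive and compress dramatically. A quick sketch using Python's standard gzip module (the markup is illustrative):

```python
import gzip

# Illustrative, repetitive markup standing in for a typical article page.
html = ("<div class='post'><h2>Title</h2><p>" + "content " * 200 + "</p></div>") * 10
raw = html.encode("utf-8")
compressed = gzip.compress(raw, compresslevel=9)

# Text assets commonly shrink by 70-90% under GZIP.
print(f"raw: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

One server configuration line buys that saving for every text asset on every page, yet a quarter of sites skip it.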
Why Have Pages Bloated?
There’s a simple explanation for 2.2MB pages: we’re doing a terrible job.
As a developer, I love the web. As a user, it’s often awful. Sites are desperate to “increase engagement” with intrusive advertising, annoying pop-ups, under-used social media cruft and invasive tracking. Perhaps this leads to momentary revenue gains, but the increased bloat is counter-productive:
- Google downgrades heavyweight sites, which can harm search engine optimization efforts.
- Advertising claims to keep content free. Will users consider it free when they discover it costs $0.40 to view one page on a typical mobile data plan?
- The elevation of ad-blockers to mainstream consciousness during the year highlights user frustrations and the ease with which anyone can block irritating content.
- Users do not wait. Etsy.com discovered that 160KB of additional images caused their bounce rate to increase 12% on mobile devices.
- Web activities are starting to attract government attention. For example, UK mobile operators can be fined if a service using their network makes gains from misleading campaigns. Regulation will inevitably escalate as sites become more desperate.
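The $0.40 figure above is easy to sanity-check. It implies a mobile data rate of roughly $0.18 per megabyte, which is an assumption, not a quoted tariff:

```python
# Back-of-envelope check on the cost-per-page claim.
page_weight_kb = 2262          # average page weight from the HTTP Archive Report
cost_per_mb = 0.18             # assumed mobile data rate in USD; varies by plan
cost_per_page = (page_weight_kb / 1024) * cost_per_mb
print(round(cost_per_page, 2))  # ~0.4
```

Rates vary wildly by country and plan, but the order of magnitude holds: heavyweight pages have a real, per-view price for mobile users.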
Content is being drowned in cruft. Uncompressed, Shakespeare’s 37 plays total more than 800,000 words, or a 5MB download. Now consider Facebook’s Instant Articles overview, which suggests an alternative to bloated pages. It contains five paragraphs yet, ironically, requires 3.5MB of bandwidth plus another 50MB when you view the three-minute video. Does it convey more information than the combined works of Shakespeare? Perhaps it’s prettier, but is it necessary to show a 267KB image of a chap holding an invisible ferret?
I’ve published several of these articles, and many readers claim there is no obesity crisis, so let’s tackle the primary arguments.
Some pages will always be big
Image galleries, games, technical showcases etc. will always be chunky. Yet every developer can justify the weight of their pages. The HTTP Archive Report mostly analyzes content articles and online shops. 2.2MB per page — the equivalent of half Shakespeare’s plays — is ridiculous.
Page weight is not an indication of quality
In my experience, page weight is inversely proportional to quality. Bulky pages are often link-bait articles or meaningless marketing extravaganzas. There are exceptions, but wouldn’t it be great if there were a tool to rate content and compare that against bloat?
Users never complain about overweight pages
…because they abandon them. Analytics record those who successfully access a site. They won’t highlight those who couldn’t load obese pages or never returned again.
Even if they did, it doesn’t follow that everyone had the same experience. Smartphone access is exploding, yet mobile networks remain slow in many places around the world.
Even if we presume connectivity is excellent everywhere, do developers have a duty to use the increased capacity? Have pages become 16% better than last year?
Slimming pages means dumbing down, with fewer features and effects
Why? In some cases sites can make incredible savings with a few minutes’ effort. Activating compression and concatenating assets won’t change anything except performance.
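Concatenation really is a few minutes' effort. A minimal Python sketch (file names and contents are illustrative) that merges several stylesheets into one, trading three HTTP requests for one without altering a single rule:

```python
from pathlib import Path

# Illustrative stylesheets; in practice these come from your project.
styles = {
    "reset.css": "*{margin:0;padding:0}",
    "layout.css": ".col{float:left;width:50%}",
    "theme.css": "a{color:#c00}",
}
for name, css in styles.items():
    Path(name).write_text(css)

# Concatenate in order; the served CSS contains exactly the same rules.
bundle = "\n".join(Path(name).read_text() for name in styles)
Path("bundle.css").write_text(bundle)
print(f"{len(styles)} requests -> 1 request, {len(bundle)} bytes")
```

Real build tools (Grunt, gulp and friends) add minification and source maps on top, but the principle is this simple.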
An obsession with performance leads to more complication and maintenance
Removing unused or unnecessary images, assets, features and widgets simplifies a site. It’ll lead to fewer complications and less maintenance. No one is suggesting you should fret over every byte, but look after the kilobytes and the megabytes will look after themselves.
Additional page weight is the price of progress
In some edge cases, perhaps. Lightsaber Escape has a 120MB payload because it’s a revolutionary and experimental browser-based game (which is not analyzed by the HTTP Archive). Can the same be said for Apple’s 11.2MB iPhone 6S page, which delivers a few paragraphs and effects which (mostly) could have been achieved a decade ago in a fraction of the size?
SitePoint’s pages often exceed 2MB
Ahem. Oh look — a squirrel…
SitePoint pages can be heavy on desktop devices, although they reduce to a few hundred kilobytes when viewed on a small screen over a slower network. No site is perfect, and performance efforts are being made, but I agree it could be improved.
Why should I bother? Few others do
The reason: there’s no downside. Your users benefit from more responsive pages and a slicker experience. Your search engine ranking improves. Your conversions increase. Your hosting charges drop. You’re even reducing electricity consumption and doing your bit to save the planet. It’s a little extra effort, but the most drastic optimizations result from the simplest changes — see The Complete Guide to Reducing Page Weight.
Developers and site owners unconsciously created the obesity epidemic, but considerable savings can be made by addressing:
- Images and videos. Is a hero necessary? Can the file be resized or reduced? Could it be replaced with CSS effects? Are content managers uploading monolithic files? Will any user see all 55 images?
- Third-party content. Advertising and social media scripts can be ludicrously wasteful. Are they necessary? Are there better options?
- Attitude. Performance rarely concerns those sitting on a 100Mbps connection. Try throttling bandwidth, connecting via 3G or using hotel Wi-Fi for a while; it may change perceptions.
It’s far easier to gain weight than lose it. Adding a few extra widgets or graphics seems reasonable, but it soon mounts up. Going back and removing superfluous junk can be tedious and difficult. The answer seems obvious: don’t add it in the first place.
Few site owners care about bloat. Performance is a developer mindset. Consider it from the start of the project and it will not significantly increase development time. Addressing performance after your pages hit obese levels is too late.