2MB Web Pages: Who’s to Blame?

By Craig Buckler
We teamed up with SiteGround
To bring you the latest from the web and tried-and-true hosting, recommended for designers and developers. SitePoint Readers Get Up To 65% OFF Now

I was hoping it was a blip. I was hoping 2015 would be the year of performance. I was wrong. Average web page weight has soared 7.5% in five months to exceed 2MB. That’s three 3.5-inch double-density floppy disks’ worth of data (ask your grandparents!).

According to the May 15, 2015 HTTP Archive Report, the statistics gathered from almost half a million web pages are:

technology    end 2014    May 2015    increase
HTML          59KB        56KB        -5%
CSS           57KB        63KB        +11%
JavaScript    295KB       329KB       +12%
Images        1,243KB     1,310KB     +5%
Flash         76KB        90KB        +18%
Other         223KB       251KB      +13%
Total         1,953KB     2,099KB    +7.5%

The biggest rises are for CSS, JavaScript, other files (mostly fonts) and—surprisingly—Flash. The average number of requests per page:

  • 100 files in total (up from 95)
  • 7 style sheet files (up from 6)
  • 20 JavaScript files (up from 18)
  • 3 font files (up from 2)

Images remain the biggest issue, accounting for 56 requests and 62% of the total page weight.
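As a sanity check, the headline figures above can be reproduced from the raw kilobyte numbers. This is a minimal, illustrative sketch; the data comes straight from the HTTP Archive table above and the code is plain arithmetic:

```javascript
// Average page-weight figures (KB) from the HTTP Archive report above.
const weights = {
  HTML:       { end2014: 59,   may2015: 56 },
  CSS:        { end2014: 57,   may2015: 63 },
  JavaScript: { end2014: 295,  may2015: 329 },
  Images:     { end2014: 1243, may2015: 1310 },
  Flash:      { end2014: 76,   may2015: 90 },
  Other:      { end2014: 223,  may2015: 251 },
};

// Percentage increase of each asset type, relative to its own 2014 size.
const increase = {};
for (const [type, w] of Object.entries(weights)) {
  increase[type] = Math.round(((w.may2015 - w.end2014) / w.end2014) * 100);
}

// Total May 2015 weight, and images as a share of it.
const total2015 = Object.values(weights).reduce((sum, w) => sum + w.may2015, 0);
const imageShare = Math.round((weights.Images.may2015 / total2015) * 100);

console.log(increase);   // { HTML: -5, CSS: 11, JavaScript: 12, Images: 5, Flash: 18, Other: 13 }
console.log(total2015);  // 2099 (KB)
console.log(imageShare); // 62 (% of total page weight)
```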

Finally, remember these figures are averages. Many sites will have a considerably larger weight.

We’re Killing the Web!

A little melodramatic, but does anyone consider 2Mb acceptable? These are public-facing sites—not action games or heavy-duty apps. Some may use a client-side framework which makes a ‘single’ page look larger, but those sites should be in the minority.

The situation is worse for the third of users on mobile devices. Ironically, a 2Mb responsive site can never be considered responsive on a slower device with a limited—and possibly expensive—mobile connection.

I’ve blamed developers in the past, and there are few technical excuses for not reducing page weight. Today, I’m turning my attention to clients: they’re making the web too complex.

Many clients are wannabe software designers who view developers as the implementers of their vision. They have a ground-breaking idea which will make millions—once all 1,001 of their “essential” features have been coded. It doesn’t matter how big the project is; the client always wants more. They:

  1. mistakenly think more functionality attracts more customers
  2. think they’re getting better value for money from their developer, and
  3. don’t know any better.

Feature-based strategies such as “release early, release often” are misunderstood or rejected outright.

The result? 2Mb pages filled with irrelevant cruft, numerous adverts, obtrusive social media widgets, shoddy native interface implementations and pop-ups which are impossible to close on smaller screens.

But we give in to client demands.

Even if you don’t, the majority of developers do—and it hurts everyone.

We continue to prioritize features over performance. Adding stuff is easy and it makes clients happy. But users hate the web experience; they long for native mobile apps and Facebook Instant Articles. What’s more, developers know it’s wrong: Web vs Native: Let’s Concede Defeat.

The Apple vs Microsoft Proposition

It’s difficult to argue against a client who’s offering to pay for another set of frivolous features. Clients focus more on their own needs rather than those of their users. More adverts on the page will raise more revenue. Showing intrusive pop-ups leads to more sign-ups. Presenting twenty products is better than ten. These tricks work to a certain point, but users abandon the site once you step over the line of acceptability. What do clients instinctively do when revenues fall? They add more stuff.

Creating a slicker user experience with improved performance is always lower down the priority list. Perhaps you can bring it to the fore by discussing the following two approaches to UX …

Historically, Microsoft has designed software by committee. Numerous people offer numerous opinions about numerous features. The positives: Microsoft software offers every conceivable feature and is extremely configurable. The negatives: people use a fraction of that power, and it can become overly complex—for example, the seventeen shut-down options in Vista, or the incomprehensible Internet Options dialog.

Apple’s approach is more of a dictatorship with relatively few decision makers. Interfaces are streamlined and minimalist, with only those features deemed absolutely necessary. The positives: Apple software can be simple and elegant. The negatives: best of luck persuading Apple to add a particular feature you want.

Neither approach is necessarily wrong, but which company has been more successful in recent years? The majority of users want an easy experience: apps should work for them—not the other way around. Simplicity wins.

Ask your client which company they would prefer to be. Then suggest their project could be improved by concentrating on the important user needs, cutting rarely-used features and prioritizing performance.

2015 Can be the Year of Performance

The web is amazing. Applications are cross-platform, work anywhere in the world, require no installation, automatically back up data and permit instant collaboration. Yet the payload for these pages has become larger and more cumbersome than the native application installers they were meant to replace. 2MB web pages veer beyond the line of acceptability.

If we don’t do something, the obesity crisis will continue unabated. Striving for simplicity isn’t easy: reducing weight is always harder than putting it on. Endure a little pain now and you’ll have a healthier future:

It’s time to prioritize performance.

  • Todd Zmijewski

    I think you’re being melodramatic.

    • Tatsh

      I kind of feel the same way. Size of a web page gives no indication of efficiency or quality. It is specific to each site and the content being displayed.

      • Craig Buckler

        Demos, photo galleries and games aside, I can’t think of any content page which needs to be 2Mb. And what about the 100 HTTP requests? There’s little excuse for that.

        • Tatsh

          The average user is looking at sites that have tons and tons of resources. If the page shows *something* as opposed to nothing in short time the user hardly cares. They will quite happily navigate to the next page before the one they are on finishes.

          Yes companies try to conserve resources as always (compression, lossy compression, minification, compilation with optimisations (like removing dead CSS, dead JS), etc etc). However, there is no way you are going to get a < 2 MiB fully filled Amazon.com homepage or Facebook view. That is just not realistic.

          Imagine Amazon.com with all CSS for elements currently using images and leaving the rest to highly compressed tiled images. There must be a reason they are not doing this, be it cross-browser compatibility, look and feel of fonts across OS's, CSS quirk annoyances, and the ease of updating for the promotions and user-specific things like my wishlist and my recommendations.

          • TimTylor

            It’s not just slow loading, it’s bad performance. Browser freezeups and crashes. Snail-speed scrolling. No user is happy with a site that gives them those marvelous experiences.

  • Sassigeek

    Whilst it’s ideal that websites should be as small as possible and have better performance, I think it’s quite normal that file sizes will naturally increase as web technologies improve. But I do agree that, where possible, we should try to keep things as compact as possible. After all, a huge chunk of the world doesn’t have very good broadband, and that’s not to mention the 2.1 million people still using AOL dial-up.

    • Craig Buckler

      Newer web technologies allow us to reduce the size of pages. HTML5 has features such as form validation which no longer require scripting. Images can be replaced with CSS3 effects. JavaScript features lessen the need for libraries. So why do pages continue to grow at such an alarming rate?

      • Todd Zmijewski

        Really… HTML5 form validation. HTML5 form validation completely sucks. The “feature” provides very little in terms of practical use.

        • Craig Buckler

          That was just an example. That said, basic HTML5 form validation is fine for many forms. You can enhance it if necessary but, the point is, you don’t need masses of JavaScript to do that (see my HTML5 validation constraints articles).

          • Roman

            The biggest factor in page weight is IMAGES, not JS scripts.

          • Craig Buckler

            That’s partly true but it’s not the whole story.

            Images are non-blocking. The page can be downloaded and viewed with or without images. There are also technologies such as Opera Turbo which can compress images further if required.

            JavaScript is a blocking technology. Unless you’re careful, every script tag stops other assets loading at the same time. Then you need to consider the processing required. 300Kb+ of JavaScript has a big performance hit because that code needs to be interpreted and executed on every page.

            Neither figure looks good.

    • Roman

      The chunk of the world that is still on dial-up may be providing useless visitors to many website owners (because they don’t have enough money to buy stuff that is being offered on those sites).

      • MB

        Plenty of people with plenty of money (although this article is about CONTENT, not COMMERCE sites) don’t have access to high speed internet. And the biggest growing markets for internet access are on mobile devices, often at 3G or *LOWER* speeds. The lack of 4G where you happen to be has no relation to your ability to spend money.

  • Which came first? The solution or the problem?

    Without the “problem” of fatter web pages, we wouldn’t have needed the solution of faster internet.

    We live in a technological world. And yet we want to under-utilise that? We want to go back to the simple web pages of the ’90s (albeit without the horrendous styling)?

    The answer isn’t dumbing our sites down to the lowest common denominator.

    Adaptive is the first step. Use a library like Mobile Detect to decide what device type is being used – phone, tablet, desktop.

    Run a script to detect connection speed and deliver a page based on that.

    Just because I can only drive at 110kmh in Australia, doesn’t mean when on the German autobahn I should still drive only at 110kmh. And vice-versa.

    It’s about being adaptive.

    BTW Your maths is misleading. You can’t look at percentages of each type compared to itself. Smaller numbers will have a greater percentage increase for smaller changes. e.g. A mere 6KB of CSS is 11% whereas 67KB of images is only 5% increase. Which implies the problem is CSS? lol.

    If you look at it as a percentage of the overall change, then CSS contributes 4% to the extra weight, but images have added 45%. That is no surprise given the boom in full screen background images.
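    That share-of-the-overall-change arithmetic can be sketched as follows (the KB deltas come from the article’s table; purely illustrative, and rounding puts images at 46% rather than 45%):

```javascript
// KB change per asset type between end-2014 and May 2015 (from the table above).
const delta = { HTML: -3, CSS: 6, JavaScript: 34, Images: 67, Flash: 14, Other: 28 };

// Net change in total page weight: 2,099KB - 1,953KB = 146KB.
const totalDelta = Object.values(delta).reduce((sum, d) => sum + d, 0);

// Each type's contribution to the overall growth.
const cssShare = Math.round((delta.CSS / totalDelta) * 100);       // 4 (%)
const imagesShare = Math.round((delta.Images / totalDelta) * 100); // 46 (%)

console.log(totalDelta, cssShare, imagesShare); // 146 4 46
```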

    • Craig Buckler

      It’s not about dumbing down. If anything, most of these social networking pop-up-utilising spammy linking “features” are click-bait to entice the less experienced sectors of the web community.

      Nor is it about under-utilising what we have. But an average of 2Mb per page is ridiculous especially when most of the sites analyzed offer basic content. Why do you think most browsers now offer a reading view and Facebook Instant was born?

      Finally, detecting the device is pointless. I might be using a good smartphone on wifi or a ten year-old PC on dial-up. You should really be looking at the network and processing capabilities but none of that is practical (yet).

      The point is you don’t know where or why your site will be used. So why is it wrong to strive for performance regardless of the device?

      • Jeff M

        > why is it wrong to strive for performance regardless of the device?

        Performance for performance’s sake usually isn’t a good thing. “An obsession with performance can damage development by complicating code, leading to more bugs and more debugging, and making maintenance more difficult and costly.”

        People’s counter-arguments to your article tend to also mention faster network speeds. What if, for example, the growth of internet speeds outpaced the growth of web page weight? In that case, page performance might actually be *better* today than it was a year ago. Plus, computers have gotten faster, browsers have gotten faster. So rather than focusing on one particular detail (weight), try instead focusing on the final goal: user experience. For example, if you wrote an article that said a few years ago pages were ready and responsive in 200ms and today it’s 5s, correlated with a higher bounce rate, then that would be something to stop and think about.

        But even then, weight is low-ish on the list of how we would solve that problem. Reducing HTTP requests (same weight, just combined files), using a CDN, using caching, etc, are still more important than page weight.

        • Craig Buckler

          I’ve never heard anyone claim performance isn’t necessarily a good thing. Premature optimization, perhaps, but not performance. It’s normally about removing unnecessary stuff and addressing the bottlenecks. How does that lead to more complications and bugs?

          You’re also stating that reducing HTTP requests is a good thing. That’s addressing performance. How is it less complicated than the other performance options?

          When people are sitting on a 100Mb pipe it’s easy to neglect network speeds. But are you considering those using 3G? If anything, I’d suggest average network speeds have dropped owing to the massive increase in mobile activity.

          Let’s put this to the challenge: why not post a selection of content sites which justify their 2Mb+ weight. I’m yet to see one.

          • Roman

            > But are you considering those using 3G?

            Users accessing the web at those speeds may not be universally useful to website owners.

          • Craig Buckler

            How can you ever prove that? The only realistic way would be to create a good site, decrease the performance and watch how many users drop from the logs!

            And why discriminate against anyone regardless of their device or network? Would you purposely prevent users accessing a site with a screen reader because they’re “not useful”?

          • Roman

            There may be situations when a user accessing a site with a screen reader isn’t useful to the site’s goal.

            Also, you confused not caring with purposeful prevention. That is a big mistake.

          • Craig Buckler

            Purposeful prevention? These are content sites. What possible reason could there be to actively block anyone? The only person you’re harming is you.

          • Roman

            You have also confused not caring with purposeful prevention. Nobody here has advocated purposeful prevention.

          • Craig Buckler

            By creating 2MB pages, you prevent access whether you knew it or not.

          • Taylor

            Well for starters this page is more than 3Mb

  • Nicolas Hoffmann

    I think the option to “develop it yourself and keep it light and simple” could be a good one in some (a lot of?) cases. We don’t need Bootstrap and everything it offers for simple websites.

    Especially the share buttons: 600KB for 5 images/links. Gosh.
    The Facebook badges/modules are especially heavy.

    Also, for layout and retina optimization: use SVG where possible. It reduces file size and is future-ready. :)

    • SVG is great but many times it takes longer to develop an SVG resource.

  • Outside the Marginals

    “I’ve blamed developers in the past, and there are few technical excuses for not reducing page weight. Today, I’m turning my attention to clients: they’re making the web too complex.”

    I would suggest that you may have misunderstood who your client is. Your client is not the techie or pseudo-techie sitting across the (virtual) table from you – that’s the nominated contact. Your client is the organisation. Now they may have been foolish in nominating a pseudo-techie as the contact, but if developers accept that situation and do not try to get through to the real client (or even their customers) they must expect these sort of problems.

    So perhaps we should still blame the developers?

    • Craig Buckler

      It doesn’t matter who you define as the client but, I agree, developers can point out the issues. Unfortunately, we’ve all been instructed to implement idiotic features despite protests. Sometimes, it’s easier to give in.

      • Agreed – it’s *always* easier to give in, but rarely improves anything. It maintains the status quo – which yes, may mean your job, but also often involves someone without relevant skills dictating to someone who *has* got them. This master/servant relationship leads to stagnation, bloat and project flaws that leave the impression that the designers and devs didn’t know their job. Cue loss of self-esteem and professional authority, perpetuating the problem.

        There’s little point in the community discussing best practices if the actual practice is determined by those who don’t listen – but there’s a *reason* they don’t listen. It’s because we often aren’t speaking their language. Paul Boag has long argued that we need to develop peer/peer relationships with clients/employers – to be seen as experts *worth* listening to. This means employees convincing managers that communicating their expert advice upwards makes them look smarter to *their* bosses. This can seem impossible, but many things do – until you try. Freelancers should have it easier. Successful projects help our portfolios as much as the client’s business – but we need to present that as a reason for clients to default to trusting our judgement.

        Either way, the authority to insist on best practices won’t just fall into our laps – in that sense, we’re as guilty of failing to communicate as the clients/managers that “don’t listen.”

        • Craig Buckler

          I totally agree and that’s the point I’m trying to make. Developers aren’t perfect, but they certainly know more about the web than the client — otherwise that client wouldn’t need a developer.

          So yes; speak their language, offer proof that features don’t work well, and wear “I told you so” T-shirts every day!


  • absolutholz

    Great rant! I’ve worked on too many sites that were above the 2MB threshold. Although I know there can be very valid reasons for exceeding this, and good methods of doing it without losing usability points, my experience has shown me that almost no one cares about this. Frankly it has caused me to question whether my passion for web development couldn’t be better used elsewhere.
    My last project had 5 tracking systems used on every page, and an additional one for mobile devices. These ‘of course’ had to be loaded before the content. I suggested loading these at the end of the page, and/or eliminating 2 or 3 completely. My logic was that counting users before we know that they had received the page’s content may be less than useful. This of course was blasphemous and I was flogged mercilessly until I agreed to add 5 more ad slots to all pages.
    My current project has no img tags in content teaser lists on its various landing pages because these are ‘bad for seo’. This from a fellow developer, who considers himself a UX expert. I could go on, and would if I weren’t on my phone in the park.
    Are there any clients out there that actually care about quality of code and usability and realize that it can have a direct impact on their revenue streams? Maybe like Amazon’s classic 100ms study. I don’t think amazon is going to come knocking at my door any time soon, so I’ll just continue the good fight of arguing for perceived performance and hope that one day I can build a site that isn’t ad-destroyed … I mean driven.

    • I remember legacy pages I was working on: 4 tracking JS pixels + some shoddy closeouts + 4 CSS files + legacy jQuery + jQuery UI + jQuery Validation just to validate a name and email xD.
      At least when I made a new page from scratch it had only the tracking codes + a lighter closeout and 1 CSS file, and I made it responsive.
      Guess what? The boss/client didn’t care or notice the change, but I felt good about the new page.

    • Craig Buckler

      I feel your pain.

      Five tracking systems?! That’s mad. They almost certainly won’t agree with each other so what’s the point?

      And, yes, I’ve seen sites sacrificed at the SEO altar. Generally because an SEO “expert” read something in 1998, took it to be gospel, and added it to their site, page, website, webpage, web page, kitten, Britney Spears.


  • M S i N Lund

    Clearly its the work of the European Union and their godless liberal gay nazi agenda.

  • Most of the time devs do what their boss/client says. And the boss/client doesn’t care about performance – he needs 1,000 JavaScript things and a ton of images…

    And devs don’t argue – whoever pays the money gets the music he wants. In this scenario it’s fair. Devs are making the site for the boss/client, not the users.

    The second scenario is when devs are in a hurry or incompetent. This scenario should be eliminated.

    I’m not saying we as devs need to spend a ton of time on optimization, but we should at least do the simplest and most effective things: optimize images, minify CSS and JS, reduce requests where we can, and try to make things work a bit better on smaller screens.
    I’ve seen a huge number of websites that could be improved hugely with an hour or less of work.

    And yes, in my opinion people are over-building websites. In most cases you could cut page size and requests by 80% without users even noticing.

  • Ed Brandon

    I too get annoyed at the size of some pages I have to download. I have to pay to download them – the more bandwidth I use, the higher the ISP bill.

    Newspapers, TV channels and such typically display particularly heavy pages.

    One feature I’m becoming more and more annoyed with is infinite pages. Not sure if Facebook pioneered this, but they are a good example. As you scroll down, more is presented.

    I’d love to see us get back to more use of menus (top and left), properly proportioned with depth, so one never needs to see too much, nor have to go too far to get what is of interest.

    But perhaps the handheld device is part of the problem. Menus are a bit harder to deal with there.

    In short, I don’t return to large sites more than I need to.

  • Jon James

    At first I thought “that’s awful” then I thought, with bandwidth being what it is these days, and with caching… Maybe it’s really not that bad.

    The floppy disk comparison is somewhat petty IMHO. Like complaining about 1080p screen resolution or 16k colors… Things just get more advanced.

    • Craig Buckler

      Are things really more advanced? What are you doing on a webpage today which you couldn’t do, say, five years ago? Or ten, for that matter (post-Ajax)?

      I don’t think the bulk is necessarily adding anything useful. A large proportion of it is sharing buttons which appeal to a tiny fraction of users yet take 500KB+ (see Fat-Free Social Buttons at http://www.sitepoint.com/social-media-button-links/).

  • Sp4cecat

    Surely, the metric should be page load time and user experience – page weight, the combined memory taken up by all the parts that make the page, is a number that means nothing.

    Like many old school, first gen web developers, I used to sweat over getting that gif down by 1kb, using text and tables instead of images, cranking the weight down to 100kb – all so it could load just as fast as any page now, if not slower.

    2MB is a shocking amount if squinted at through a 1990s CRT display, but the figure really means nothing now.

    • Craig Buckler

      Try downloading over 3G – it definitely means something then!

      • Sp4cecat

        Well, yes .. but that is .. 750x more than 2mb. I look forward to your follow-up article in 5 years: “3GB web pages – who’s to blame?” :)

        Ironically, I used to load a full website on to a 1.44mb floppy to demo to clients, with room for zork.exe .. ahhh the 90’s.

        • 3G would be the mobile network (though there probably is a 3GB page somewhere!). Not everyone will be on WiFi all the time, and a huge percentage of the global population is probably not on WiFi or broadband at all, but on mobile data networks.

          I’d go one step further and suggest either a dial-up connection, or 3G mobile signal in a remote part of a developing nation.

          • Roman

            Not everybody cares about good user experience for visitors “in a remote part of a developing nation”. A goal for having a website is often to make money by selling a product or service. If a visitor can’t afford to buy, then he isn’t a useful visitor.

          • Apologies, I should have used a better example.

            If someone lives outside of a major city then there’s a strong chance they have a poor mobile signal. Some parts of the UK that I know of have almost no mobile signal and poor WiFi/broadband connectivity. These users might use the internet at home (albeit slowly), but when they’re out and about they may not use it at all – not through lack of desire, but because they know it’ll take forever to download a page over their sporadic mobile connection.

            Some companies have WiFi at their offices for guests, but the connection is sporadic and slow (it took about 20 minutes to watch a 3-minute YouTube video on my phone this morning). IT policies can prevent staff using their PC or laptop to browse or shop, so people use their phones at lunchtime.

            In either of the cases above, the user is a typical person with an average income who happens to live in an area with poor internet connection, both broadband and mobile. If a page is bigger than it needs to be then they won’t use that site again, and the company would be losing out on sales. Why should a user have to do all their browsing in an internet cafe several miles from their home because companies feel that they don’t need to revisit their pages to reduce the weight and bring performance up to speed?

          • Roman

            Yes, this example is better than the previous one. However… do you know about the 80/20 rule? It is possible that spending time and money to change a website to cater to the people you described isn’t sensible.

          • Craig Buckler

            So are you saying it’s OK to create 2MB pages if it saves time and 80%+ users are fine with it? In other words: performance isn’t necessarily important?

            We’ll have to agree to disagree. I don’t think there’s any excuse. How did that page reach 2MB? Are all those features strictly necessary?

            Using your argument, 80% of people will use just 20% of your features. If you can identify that 20%, you’ve just saved up to 80% development time and increased performance accordingly.

          • Roman

            I don’t think that’s a correct application of the rule. Not 80% of users. Identify types of visitors that generate most of your income and ignore those catering to whom costs more than what they bring.

          • Craig Buckler

            It’s hardly a rule! But I certainly stand by the observation that 80% of people use 20% of features (it’s just a shame that it’s not the same 20% of features for everyone!)

            So you’d rather spend time identifying user types? That’s a lot of investigation work and, unless you can contact a high proportion of potential users, who’s to say your research will be accurate? It’s difficult for people to determine how and when they’ll use a system they haven’t seen.

            Why not spend that time creating a system which works well for everyone? Concentrate on the vital features and make a slick experience. Release early, evaluate and loop back. That seems far more viable than creating a bulky system which possibly can’t be used by those you didn’t/forgot to consider.

            Ultimately, though, you still appear to be saying 2MB pages are fine if it brings in revenue. The simple fact is that increased performance will bring more revenue. If you worry about performance from the start, it won’t add significant development effort. Where’s the downside?

        • Craig Buckler

          “but that is .. 750x more than 2mb”

          Not sure how you calculated that, but 3G networks provide peak data rates of 200 kbit/s. That’s bits – not bytes. It equates to 25KB per second — if you’re lucky. A 2MB page takes 80 seconds to download.

          Is the misunderstanding of these terms one reason why developers think 2MB pages are absolutely fine?
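          The back-of-the-envelope sum is easy to verify; a sketch using the 200 kbit/s figure above (it lands on ~82 seconds with a 1,024KB megabyte):

```javascript
// 3G baseline rate quoted above: 200 kilobits per second.
const kbitPerSecond = 200;
const kbPerSecond = kbitPerSecond / 8;  // 8 bits per byte => 25KB/s

// Time to fetch a 2MB page at that rate.
const pageKB = 2 * 1024;                // 2MB expressed in KB
const seconds = pageKB / kbPerSecond;   // ~82 seconds

console.log(kbPerSecond, Math.round(seconds)); // 25 82
```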

          • Sp4cecat

            Ahhhh .. sorry, I saw ‘try downloading (a website that is) over 3GB’ where you meant ‘try downloading over a 3G network’

          • 3G only provides a max of 200kbit/s? Once again, you’ve got it wrong. 3G networks must be able to provide AT LEAST 200kbit/s, but they’re often much, much faster than this, as I previously stated.

            For example, “In India, 3G is defined by telecom service providers as minimum 2 Mbit/s to maximum 28 Mbit/s.”

            I think it’s you who is misunderstanding these terms.

  • I reckon most developers throw around Angular and jQuery just for kicks. Even for an extremely simple webapp, they have to use these libraries, which contribute a huge chunk of the page load.
    AngularJS: 123KB minified
    jQuery 2: 81KB minified
    Ember.js: 272KB minified

    • Roman

      What you listed are peanuts. What really makes up huge chunks of pages is IMAGES.

      • Craig Buckler

        Not necessarily true – see my comment above.

  • seba

    But if you’re writing a more complex app, not a simple website, isn’t 2MB impressively small?
    And it’s even cached after the first visit.

    • Craig Buckler

      As mentioned, the survey above looks at content websites. Not apps. Not games.

      2Mb is ludicrously large.

      • This problem is not confined to webpages. Software has grown larger, apps have grown larger. For example, how can a bugfix update for Facebook on iOS be 40MB? Every time? It can’t just be a bugfix; you’re re-downloading the entire app. Great if you’re on superfast broadband; otherwise, it’s a pain every time.

        Webservers the world over are getting unnecessarily hammered by repeated requests for large downloads, unminified, uncompressed files, and html markup without the whitespace removed. All uses energy and/or mobile battery.

        Yeah, I just love my Blackberry’s battery draining from the simple act of trying to view a 2mb webpage, then crashing, because some 3rd party API used on that page borks for some reason.

        • feamoignargfaionakfj9ajfopamjv

          Which BlackBerry? ;)

          • It’s more than two years old. BB9320 – not bleeding edge by any means. More like bleedin’ hell :-p

  • Craig Buckler

    So you’re saying it’s OK to have lots of stuff because few users would see it? If so, why add it?

    Facebook is an application – not a standard content site like the ones measured here. It’s mostly a single page and they work harder than most at performance.

    Amazon pages are heavy, despite recent performance claims. That said, they also offer separate cut-down mobile sites and apps to improve the user experience across a range of devices. Do you have the time and budget to do that?

    • Roman

      I am sure he meant that the users would navigate to another page in another tab while the heavy page is being loaded.

      • Craig Buckler

        Really? Great. Let’s bring back loading screens.
        Users do not wait around.

  • I read the headline and almost choked on my drink!

    I know a lot of people are asking “what’s the problem?” but a user having to download 2MB of resources for a page is crazy, especially as they’ll then be downloading and rendering additional MB as they navigate the site.

    Then you move to the mobile world. While adaptive design might work in some circumstances, it won’t always be practical for a site; the technology involved will be beyond the reach of some companies (I’m thinking charities with no resources, SME businesses, start-ups in growth nations, etc.). In those cases, cleaner front-end code would let them go responsive with minimal code and still huge flexibility in the design.

    As @shash7:disqus and others have said, the way a lot of developers use frameworks and libraries bears some of the blame for this. It might be quicker to use Bootstrap as the framework, but is it the right thing to do? Would writing your own framework based on the projects you work on be better, since you can then add things as you need them rather than having everything there from the start? The same applies to JS libraries: are they needed because they’re quicker for you to work with, or are you (like me) not able to write your own JS easily?

    I was reading an article about the 5K challenge that A List Apart used to do; it would be interesting to see whether something could be done along similar lines (maybe build an entire web page for no more than 50K of code and images?). If it can be proven that good designs and functionality can be built with minimal code it should go some way towards proving to clients and companies that performance doesn’t have to be at odds with artistic and corporate design principles.

    • Roman

      > Would writing your own framework based on the projects you use be better
      since you can then add things in as you need them rather than having
      everything there from the start?

      Those frameworks are customizable and allow you to get only what you need.

      > The same applies to JS libraries; are they needed because they’re
      quicker for you to work with, are you (like me) not able to write your
      own JS easily?

      Who in their right mind wants to deal with browser differences?

      Remember that network connections and computing devices become faster quickly, but basic performance of human programmers remains the same.

      • Craig Buckler

        Not all frameworks are customizable and, in my experience, few developers bother because it’s easier to package everything (our performance remains the same, remember!)

        Which browser differences are you referring to? If you’re not developing for IE6/7/8, the differences are almost negligible. If you use progressive enhancement, you can ignore them altogether.

        • Roman

          Those who consider writing their own framework probably won’t be too lazy to configure an existing framework. We were comparing these two approaches, not psychological issues.

          • Craig Buckler

            Those who consider writing their own framework probably won’t use another so it’s a moot point.

            Removing stuff you don’t need is far more difficult than adding stuff you do. This is why those using frameworks such as Bootstrap rarely bother to rip out unnecessary code – it’s complicated and requires significant testing. It leads to unnecessary bloat but that’s less hassle for the developer than fixing a broken system.

          • Roman

            Considering writing a framework isn’t the same as actually writing one. Those who consider doing it could decide to use an existing solution instead.

            Frameworks such as Bootstrap and Foundation have wizards to let people get only what they need. That’s their official supported method.

          • Craig Buckler

            Official supported method? So why do all of them have a big “download” button on their home page which includes everything? (And Bootstrap doesn’t have a wizard either).

            That’s the easy option. You don’t necessarily know what you will and won’t need at the start of a project so it’s simpler to grab everything. You may have good intentions about removing the unnecessary stuff, but you probably won’t because it’s difficult.

            I don’t believe the majority of developers using a framework grab only the components they need. Do you have figures to support your theory? If it’s true, explain why we have 2MB+ pages?

          • Bootstrap doesn’t have a wizard? Of course it does. It’s right at the top of the official page under “Customize”. These kinds of statements really suggest you simply don’t know what you’re talking about.

            Quit speaking for all developers. You don’t know how people work or their methods. Maybe they do grab the whole framework, all 29K of it in the case of Bootstrap. It’s clearly not the problem.

            We have 2MB pages because people really like images. It’s not rocket science, it’s a well known fact that images make up the bulk of data transmitted over the Internet. Case in point, loading the DISQUS home page loads 4967.6K, 4477.6K of which is made up of images. CNN’s home page loads 2234.5K, 1456.3K of which is images. Amazon’s home page loads 6414.8K, 5517.9K of which is images. The numbers are easy to understand if you bothered to take the time to look at them instead of just whinging about it. And images will load after text appears on the screen, meaning people can start to use a page much sooner than having to wait for a full 2MB to load.

      • While the frameworks do allow people to use what they need, how many sites have you visited where the CSS from the framework contains extra code that the site isn’t using? This still takes up space and needs to be downloaded. Whenever I’ve used a CSS framework, I’ve found it quicker to rewrite the stylesheet from the beginning based on the HTML that I’m using, rather than trying to unpick the stylesheet, and tend to get far smaller file sizes as a result.

        Regarding jQuery and JS libraries, they do handle browser differences. However, the vast majority of the sites I’ve seen don’t need as much JS as they use. I’m still seeing websites using JS to make rounded corners on divs, and we’ve had border-radius for several years now. I see sites keeping JS files when the functionality that used that file is no longer being used. I see sites download an entire JS framework library and use 1 part of it, which could be rewritten as vanilla to take up less than 1% of the size. I can’t think of much that I see on most websites I visit that a JS library can do that a good JS developer couldn’t recreate with vanilla code pretty quickly.
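As an illustration of that last point, here’s a hedged sketch: jQuery’s $.extend, as it’s most often used, is just a shallow object merge, reproducible in a few lines of vanilla JS (newer browsers also ship Object.assign, which does the same natively):

```javascript
// A vanilla stand-in for the common (shallow) use of jQuery's $.extend.
function extend(target, ...sources) {
  for (const src of sources) {
    for (const key of Object.keys(src)) {
      target[key] = src[key]; // later sources win, as with $.extend
    }
  }
  return target;
}

const defaults = { speed: 400, easing: 'linear' };
const options = extend({}, defaults, { speed: 200 });
// options is { speed: 200, easing: 'linear' }
```

The values here are hypothetical plugin options; the point is the helper, not the data.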

        • Roman

          I’ve done that once: I wrote a small piece of vanilla JS for the reason you described. And then I needed more, and then some more, and then some more… In the end, I realized I had wasted my time.

          • Craig Buckler

            Again, what were you trying to achieve and what browsers were you targeting?

            If you’re supporting IE6/7/8, there may be good reason to use something like jQuery 1.x (although these days I’d recommend progressive enhancement instead).

            The more vanilla JS you write, the more you realize monolithic frameworks are rarely necessary.

          • Roman

            Yes, I’ve been supporting those versions so far. But jQuery helps write less code in a smaller amount of time for newer browsers too.

            Computers and networks become faster, but humans don’t.

          • Craig Buckler

            “But jQuery helps write less code in a smaller amount of time for newer browsers too”.

            I won’t say “never use jQuery”, but try it. It doesn’t save as much as you’d think, especially since you no longer need to translate between standard DOM and jQuery objects. The page weight and performance benefits can be considerable; in some cases, vanilla code is many hundreds of times faster.

    • In their entirety, Bootstrap’s CSS and JS are a whopping 29k minified and gzipped, and that’s without making a custom build with only the features you need, which would result in even smaller sizes. Yet people talk about Bootstrap like it’s the biggest problem when it comes to page size, which clearly isn’t the case.

  • Vote 539

    In my opinion, the things that slow down my web browsing experience the most are what I call “social bloatware”. You know, the annoying AddThis sharing buttons, live Facebook and Twitter streams, and the like. I share some of the blame since I too have those on my web site. I have them coded up so that they are not downloaded until all of the core content on my site is ready, which somewhat improves load time, but that still doesn’t get around the fact that I’m making my users send an extra 24 HTTP requests and download 220KB worth of content. (That’s just for one AddThis button and a “Follow me on Twitter” button!)
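The deferral technique described above can be sketched roughly as follows. This is an assumption-laden illustration, not the commenter’s actual code: `injectScript` stands in for whatever appends the real `<script>` tag, and in a browser `whenReady` would be `cb => window.addEventListener('load', cb)`.

```javascript
// Defer third-party widget scripts until the core content has finished loading.
function deferWidgets(scriptUrls, injectScript, whenReady) {
  whenReady(() => scriptUrls.forEach(injectScript)); // nothing fetched until then
}

// Simulated run so the ordering is visible outside a browser;
// the URLs are hypothetical widget bundles.
const requested = [];
deferWidgets(
  ['/js/addthis-button.js', '/js/twitter-follow.js'],
  url => requested.push(url),
  cb => cb() // stands in for the page's load event firing
);
console.log(requested); // both URLs, requested only once "load" has fired
```

This improves perceived load time, but as the commenter notes, the requests and bytes still happen eventually.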

    Fonts are also a problem. I mean, is Arial or Verdana really all that bad to justify downloading another 100 KB of material? I have my doubts.

    Personally, my main web project is a single-page application. I am far above the average weight for JavaScript (close to 800 KB), about half of which is the Cloud9 Ace Editor. However, I am very good on the images front (less than 100 KB), and my total page weight sums up to 1.4 MB.

    • Craig Buckler

      I agree about social networking bloat. That single script tag looks lightweight but it loads a ton of resources and, in general, fewer than 1% of people use it.

      I agree Microsoft is heading in the right direction (although some would say Windows 8 was a lower point than Vista given the sales figures). But I still think the point about simplicity is fair. Apple strive for it. Microsoft less so.

    • Sp4cecat

      That’s a really good point, and it kinda highlights how you can’t bring a good web page down to one metric. If a lot of the site is loading asynchronously while the user is engaged with a page that looks loaded, the UX doesn’t suffer. Similarly, a 500KB site consisting of 500 1KB files is going to feel worse than a 2MB site with, say, three files.

  • Hamid Mohayeji

    Actually this page had more than 170 requests and the size was almost 1.5 MB. Interesting :)

    • Craig Buckler

      And that’s considered “low”…

  • Ri

    You wrote Mb everywhere in your article, which is a measurement of speed and stands for “megabit”. You are talking about megabytes in your article, which is a measurement of size, and it’s abbreviated MB. Your table’s abbreviations are also wrong; they should read KB, since Kb stands for kilobit, not kilobyte.

    • Michał Ociepa

      Mb is just an eighth of MB (a byte being 8 bits, at least in most cases) and therefore is also a measurement of size. What you’re thinking of is Mbps or Mb/s (megabits per second). Sometimes people drop the time part when it’s easily deduced from the context, but technically Mb alone is still a measurement of size.

      • Ri

        Fair enough.

  • evanmcd

    In my opinion, discussions about page weight are irrelevant without facts about download speed. Without knowing the ratio between page weight (and other factors like reduced HTTP requests) and download speed, it’s silly to blame developers and designers for taking advantage of their customers’ ability to get more data over the same period of time.

    Based on the Net Index from Ookla, average download speed globally has increased about 8% over the last five months. That’s across both broadband and mobile.

    Optimizing for performance is always a good thing when done responsibly, but articles like this should focus on BOTH factors before recklessly blaming anyone.

    http://explorer.netindex.com/maps (then click GLOBAL INDEX).

  • Craig Buckler

    The average page took 7.1 seconds to load in 2012 yet, even with network, processing and browser speed increases, it’s increased to 10.4 seconds in 2015. (Source: http://www.soasta.com/blog/page-bloat-average-web-page-2-mb/).

    Sites are bloated. Why make excuses?

    • Your link doesn’t work, but average on what? That’s the whole point of what evan is saying. I’ve got a paltry 10 megabit connection and NOTHING takes 10+ seconds to load.

      • Craig Buckler

        Looks like the URL was cropped – try this: http://goo.gl/HSj0H1

        Perhaps that is the cause of the issue? You have a reasonably fast connection (which you consider slow) and presume that no one else has more limited bandwidth. The sites you load never take long so 2MB pages aren’t a problem. QED.

        It’s like claiming pollution isn’t an issue because you drive a Prius.

        I don’t know what pages you’re visiting but to state nothing takes 10+ seconds seems unlikely. Even if it’s true, how can you justify a 2MB page weight? On 3G, it will take more than one minute to load.

        • That all depends on which 3G you’re using. Even the slowest should get more than .5Mb download, and the fastest now tops out at more than 21Mb. In the UK, as of November 2014, the average 3G speeds are in the 5-6Mb range. In the US it’s more likely you’ll suffer on Verizon and Sprint with about .6-.8Mb, but T-Mobile and AT&T average more than 2Mb. So there’s no way 2MB is going to take most people in the developed world more than a minute to load. Worst case scenario should be about 40 seconds, but more realistically it’s going to be 10 seconds or less.
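For reference, the raw transfer-time arithmetic behind figures like these, as a back-of-envelope sketch that ignores latency, TCP slow start and concurrent requests:

```javascript
// Seconds to transfer a page: size in megabytes (MB), link speed in megabits
// per second (Mbps); one byte is 8 bits.
function secondsToLoad(pageMB, linkMbps) {
  return (pageMB * 8) / linkMbps;
}

console.log(secondsToLoad(2, 0.5)); // 32 (the slowest 3G case above)
console.log(secondsToLoad(2, 6));   // about 2.7 (mid-range UK 3G)
console.log(secondsToLoad(2, 10));  // 1.6 (a 10Mb connection)
```

Real load times sit somewhere above these floors, since round trips, DNS lookups and per-request overhead all add to the raw transfer time.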

          And yes, with 10Mb, no pages take more than 10 seconds to load. 2MB would take 2-3, maybe 4 seconds tops. Most sites load in the 2-3 second range. Even the likes of Amazon, topping out at 6.5MB on the desktop takes about 3 seconds before the page is usable with some images still loading below the fold. Though I do believe they have a lighter version of their site for mobile.

          Which raises another point… so what if they sampled from half a million web pages for this survey? 90% of the pages on the internet make up sites put together by amateurs that are hardly ever visited by anyone. What portion of the survey is made up of the other 10% of the sites on the internet that 90% of people use every day? How many sites have a mobile version that is smaller than their desktop version? Without quantifying their data it’s too great a generalization to say that “the internet is too big and unusable” when that’s not the case.

          There are still lots of people on dial up. Does this mean we should always be catering to the lowest common denominator? I don’t think so, and clearly the corporate world doesn’t think so either. We’re moving forward and people need to keep up with basic technological advancements if they want to enjoy an optimal experience on all the sites they visit. If the technology isn’t available in someone’s geographical area, or they’re out and about on their .5Mb 3G phone, they’re not going to be using the internet like they might at home on their DSL connection, and people understand this and deal with it. Or they wait.

          • Craig Buckler

            I wish I was living in your magical 3G land!

            I have a reasonable 40Mb fibre connection and can categorically state that a good proportion of sites I visit take more than 10 seconds to load. Remember your bandwidth is only one end of the pipe – the server and intermediate hops still need to cope with serving and relaying that content.

            The survey looks at the top half million sites from Alexa. People use those daily although it wouldn’t include any behind a restricted login such as Facebook (but that’s more of an app anyway).

            “Does this mean we should always be catering to the lowest common denominator?”

            A 2MB page won’t work well for people on dial-up, average mobile or ADSL connections. That’s hardly the lowest common denominator – it’s the majority of those on the web (in the West). Forget the ‘average’ speeds you see reported by ISPs – they’re rarely based in reality for the majority of people.

            At best, 2MB and 100 HTTP requests for a single content page is madness. At worst, it’s unusable and slows down the web for everyone. Why are you defending such inefficiencies?

          • My magical 3G land? What does that mean? I’ve stated clearly the facts about 3G networks in 2014 for Europe and the US. Those are average speeds for those network’s users as conducted by several testers.

            Of course a 2MB page isn’t going to work well for someone on dial-up, but people on dial-up would find a 500KB page slow. It’s 2015 and I don’t cater to people using 20-year-old technology. As for everyone else, a 2MB page will work just fine. It’s far from unusable, as shown by the billions using such pages every single day. If these pages were unusable, these companies wouldn’t be making any money and would be out of business.

            And if you’ve got a 40Mb fiber connection no page is taking you more than 10 seconds to load. That’s a huge load of BS and you know it. It might be the most ridiculous thing you’ve said actually. There’s nothing wrong with the other end of the pipe for the top half million sites on Alexa, they’ll serve you the data as fast as you can accept it.

          • Craig Buckler

            What you clearly stated is mobile surveys from network providers. They rarely match reality for most people – certainly not me. Many pages take longer than 30 seconds on 3G. Many timeout.

            I stated 2MB pages “won’t work well for people on dial-up, average mobile or ADSL connections”. That’s a big sector even if the dial-up proportion is small or you don’t care about them.

            And yes, a fast connection doesn’t necessarily mean sub-10-second pages. The connection speed, concurrent visitors, geographical location, payload size and other factors all have an effect. Coincidentally, a certain popular (top 25,000) site just took 40 seconds to appear – and I stopped it loading before it completed. Even a cache-primed refresh took 8 seconds on fibre.

            Perhaps you’re one of those lucky people living in a perfect connectivity bubble? Perhaps no page you ever visit exceeds a 10-second load time? Perhaps that’s the problem: you can’t believe others have a worse experience?

            You should also read this: http://www.kryogenix.org/code/browser/why-availability/

            No one’s saying 2MB pages are unusable – but they are inefficient. Again, why are you defending them? What warrants a 2MB content page? In simple terms, if pages were half that weight, download speed would be twice as fast, systems would be more responsive and you’d get more conversions/sales.

          • No, I stated connectivity test from a variety of third parties, which you can find by Googling “3G network speed tests 2015”. If I’d quoted service providers the numbers would be much higher.

            Dial-up notwithstanding, 2MB pages work fine on average mobile and DSL connections of, say, 2Mb or better. I live in the south of Spain, which is generally not highly regarded for its blistering internet speeds. I had a 3Mb connection, which was upgraded to a 6Mb connection, then to a 10Mb connection over the years, and I can say that I have never experienced internet so slow that I closed a popular website before it loaded. 3Mb is about 375KB a second, 6Mb is 750KB, and 10Mb is 1250KB. I don’t even get my full 10; it’s usually more like 8 max. Even when I was sitting on a 3Mb connection only getting 2Mb throughput (or 250KB a second), a 2MB page wouldn’t take more than 10 seconds to load.

            So a 2MB page for you, sitting on a 40Mb fiber connection, shouldn’t take you more than 1 second to load. Your maximum speed after all is about 5MB a second and you can’t possibly have that much latency. Your ping must be good, it’s fibre, not a satellite connection. If you’ve got multiple major sites taking in excess of 10 seconds to load, maybe you ought to talk to your ISP, because clearly there’s something wrong with your Internet connection.

            You keep saying 2MB pages are “inefficient”. How can you state they’re inefficient across the board? No doubt some are poorly coded pieces of garbage with badly optimized images, and I’m not defending that, but with others that’s not always the case. 2MB might just be what’s needed to provide the experience a company’s users expect. As I and others have shown, the vast majority of data transmitted on the internet is made up of images. One of your own links showed about 65% on average IIRC, but it’s often higher. Most sites I’m looking at are in the 75-80% range.

            We like images a lot. We like to see images on the news sites we visit, we like to see what we’re planning on buying clearly before we buy it, we like to share our images, and we also like really high res devices.

            Many sites also serve different experiences for people on desktop vs those on mobile (and I’m not talking about apps). CNN, for example, serves a home page right around 2MB. The same page loaded on an iPhone 6 is smaller at about 1.5MB. eBay’s home page is about 1.6MB on the desktop, but it’s less than 700KB when loaded on an iPhone 6! I can go on and on here as many of our favorite sites function in similar ways. So a 2MB home page on your desktop isn’t necessarily 2MB for someone on mobile. It could be less than half of that, thus making the experience a whole lot less inefficient and painful.

          • Craig Buckler

            Again, you appear to be saying you don’t have a problem? Great. Perhaps you live close to an exchange in a large town or city? But it doesn’t follow everyone has the same experience – regardless of their connection.

            Even then, your end of the pipe is a single part of the network. The problem I experienced the other day appeared to be a slow jQuery CDN – that could have been solved had the site loaded blocking JavaScript at the end of the page.

            Sites such as Amazon and eBay have smaller mobile payload. The reason: they have separate app-like mobile versions. How many companies have the time or budget to offer that? The majority of sites use RWD. Some will serve HD images to Retina screens so the mobile ‘sites’ are even larger!

            Remember 2MB is an average. Every developer could justify reasons why their pages are so large whether it’s images, a framework, fonts or whatever. There will be exceptions but 2MB is the norm. For content-only pages, that’s inefficient.

    • Sp4cecat

      If your site has a high ‘bounce’ rate and page performance is affecting engagement, you should be looking at ways to improve the user experience. One of the things to improve would be the amount and size of page components, but also the load order: making sure your ‘above-the-fold’ content (an increasingly outdated term) isn’t blocked by other resources loading.

      Darwinian-style, sites with bad load times and high bounce rates get downgraded in Google rankings. It’s up to the site owners to optimise engagement (yeah I just said that) by improving the user experience in as many ways as possible. One of which will be the size of the site .. but may NOT be if a lot of the site loads in the background ..

      • Craig Buckler

        I don’t believe many of these sites load assets in the background or make other efficiencies, given the size and quantity of the files. 2MB is 2MB. It’d be like putting an elastoplast on a broken leg.

  • SuperPip

    A very timely article… I’m sick to death of waiting ages for some browsers to load all manner of crap on web pages I just want to look at, much of it tracking stuff. Firefox is particularly bad and increasingly bloated. Oh, for the days of a simple browser that didn’t try to be all bells and whistles and just did the job.

    • Craig Buckler

      To be fair, I don’t think it’s the browsers which are to blame. Users will blame them, or their device or their ISP but the real culprits are those creating bloated pages.

  • What do you expect with Retina display screens (image sizes) and the common use of JS libraries and frameworks…

    I miss the days when the web had minor interactivity and animations… It was a proper UX without the extras.

  • LizzyBiz

    I’d love to take a look at some of your work and see where it falls in size. I have a feeling that this is a case of either the pot calling the kettle black, or performance highly outweighing design.

    • Craig Buckler

      Personally, I strive for minimal code. I rarely use bulky frameworks, libraries or images when there are alternative options. That’s possibly because I started coding during the dial-up days, when adding quotes around attributes was considered wasteful! I wouldn’t say my sites and apps are perfectly efficient, but none reach 2MB per page.

  • davidcrandall99

    I get the point of the article, but… reading the comments… “and what about 100 requests? There’s little excuse for that.” This page makes 110. Complaining about 2MB web pages… this one is 1.5. One commenter described this post as a bit melodramatic, and I agree. As a practice, devs should definitely code for efficiency, but fretting over an arbitrary number in a day and age when it takes less than a second to download 10MB of data is really kind of… unnecessary, for lack of a better word. I think the reason behind the weight is more important than the wait itself. I referenced this page being relatively heavy itself, but that’s mainly due to avatars, fonts, and the functionality necessary to make this page an interactive environment.

    Many designers and developers create experiences that are highly visual and require more resources than pages driven purely by textual content. Limiting yourself to an arbitrary page size can limit creativity for many web designers, and that’s a shame. Again, we live in a time when download speeds are becoming less and less relevant, making more room for interesting experiences, and that should be embraced.

    I do agree – code efficiently, make your content and applications as light as possible, but don’t limit yourself to a number.

    • Craig Buckler

      “in a day and age where it takes less than a second to download 10MB of data”

      Tell that to people using 3G. Or wifi in a hotel. Or network access at an airport. Or those living in a broadband blackspot.

      The problem is that 2MB is an average for content sites. Every designer and developer will be able to justify why their pages are so large. If it’s so reasonable, why are there so many new initiatives to take these bloated processes out of our hands? Think reading modes, Google AMP, Facebook Instant etc.

      No one’s suggesting an arbitrary limit but most of the page bulk is inefficient digital crap – not “interactive environments”.

  • DigitalPsyche

    “Many clients are wannabe software designers and view developers as the implementers of their vision.”

    This. The reason I got the heck out of web development.

    As a user, surfing on a tablet, and sometimes on my PC, is often horrendously painful with poor performance, frequent crashes and reloads, and mobile sites that lock you out of the main site and provide few of the features. I don’t understand how these other commenters can defend the current crappy state of the web. It’s bloated. Pages are too busy. Adblockers support this position, meaning that users are redesigning pages based on their own needs. I don’t begrudge revenue to a content creator, but if your ads, and poor site construction prevent me from viewing the content, adios!

    Performance is essential to accessibility. Device detection is pure laziness, an excuse for sloppy coding and delivery. Fluid and dynamic pages with proper structure can solve this. The server-related costs of web traffic used to necessitate streamlining and flexibility.

    I started in IT in 1998. I suppose I’m old fashioned. The web is not the same place as it was, often for the better. This issue, however, doesn’t even need to exist.