2MB Web Pages: Who’s to Blame?

My old website only used it for sticky navigation. So there’s that.

So it would be better to reinvent the wheel and write your own sticky navigation — is that what you and everyone else are saying? You think that’s more practical and a better option than spending that time on other features that require custom engineering? Yeah… right.
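As an aside on the sticky-navigation example: modern CSS can do this with no jQuery — or any script — at all. The sketch below assumes a top-level nav element with a class of `site-nav` (the class name is illustrative, not from the thread):

```css
/* Assumes markup like <nav class="site-nav">…</nav> near the top of the page */
.site-nav {
  position: -webkit-sticky; /* prefixed form for older Safari */
  position: sticky;         /* element scrolls normally, then sticks */
  top: 0;                   /* stick to the top of the viewport */
  z-index: 100;             /* keep the nav above page content */
}
```

The element behaves as normal flow content until it would scroll past `top: 0`, at which point it pins in place — no scroll listeners or offset maths needed.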

I didn’t know jQuery, let alone vanilla JavaScript.

It’s a lot easier to drop in a jQuery plugin (which is more easily found) and include jQuery.

No one said anything about it being more practical or better. I’m not sure where anyone said that.

Using jQuery for ONLY a sticky menu qualifies – no?

You asked for this

I provided it.

Of course. I gathered you were in doubt that anyone includes jQuery for one/two “things”. I provided evidence that it exists. Simple as that.


Also, if you load jQuery from Google’s or jQuery’s CDN, then there’s a good chance the user will already have it in their cache, and any worries about weight become moot. We could use jQuery for “just one thing” at no practical cost. I think it’s a mistake to focus on weight as if it’s the yardstick for performance. It isn’t.
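For reference, the usual way to lean on that shared cache is to load jQuery from a public CDN with a local fallback in case the CDN is unreachable. The version number and the local path below are illustrative, not from the thread:

```html
<!-- Load jQuery from Google's CDN; version is illustrative -->
<script src="https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js"></script>
<script>
  // If the CDN request failed, window.jQuery is undefined, so fall back
  // to a self-hosted copy ("/js/jquery.min.js" is a hypothetical path).
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>
```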

Forest for the trees.

I was looking for more or less examples of sites that matter. Sites that have an actual audience, not just one-off pipe dreams. I would challenge you to find a site with a large audience, let’s say over a million visitors a month, that uses jQuery and/or other libraries for only one or two “things”. That is a more accurate representation of the “web” – not some one-off site that hardly anyone visits.

They often do it because they don’t know better. They are using high-spec devices with fast connections, and they don’t realise that there are genuine and valuable users out there who don’t have that. They often tend to think that, OK, maybe there are people with low-spec devices or slow connections, but they aren’t going to be serious users, they aren’t going to spend any money, so it isn’t worth making an effort to make the website work for them. It’s better to give people who can get it a fancy-schmancy design that allows you to show off what you can do than to allow the possibility of anyone seeing anything that isn’t pixel-perfect the way you wanted.

It doesn’t help when you get gods like Google putting two fingers up to accessibility and compatibility, and flat-out refusing to serve anything other than a horrendously cut-down and featureless version to people not using the very latest browsers.


It’s a big deal because there are a lot of people whose browsing is completely crippled by page weights of 2MB. When I’m using my mobile, at best I can get 3G in some places, but a lot of the time I’m not getting that. All too often, I get pages that take so long to download that my phone gives up halfway, or have so much cruft and fancy stuff in them that they don’t work properly, because nobody checks pages on Windows Phone.

Are you saying that I don’t deserve to have a good experience on the web? Because that’s what it sounds like to me.


Yes, you’re not entitled to anything in that regard. The majority of an audience shouldn’t have to suffer, nor should a client be limited in their vision, simply because you choose to use a crappy phone and have a poor wireless connection. Though I would agree that developers should be testing on Windows Phones, that isn’t always practical. With the rise of browsers providing “mobile” views, I do think developers have become a little lazy about testing on actual devices – on that I would agree.

Going back to the discussion at hand, no one has provided any examples of sites that actually matter. This entire conversation seems to revolve around generic deductions about low-audience websites. Unless someone can prove me wrong rather than just swallowing what they are told.

Let me set the record straight as well. I do believe sites should be optimized. What I’m against is telling a client no due to a limitation affecting a small percentage of an audience.

Great. Then let’s just end this, because you’re not going to change our minds, and we’re not going to change yours. You go ahead and make your pages as heavy and bloated as you like - I’m sure your clients will love you for the fact that you’re ignoring those who aren’t privileged enough to have decent access to your sites. I’m going to keep optimising my sites/apps as much as is feasible.

Let’s just agree to disagree, and get on with life.

:slight_smile:


Can I go back and read the rest of the article now?

I think it’s good. The more people making slow websites, the more my faster, smaller website moves up the rankings :slight_smile:

I think the current trend of really big background images is partly to blame for the page weight. I’ve seen a couple of sites recently with multiple 2MB background images. I’ve done a similar layout and got the images to ~200KB each without losing too much. When it goes to phone, I switch the images with CSS so the big ones don’t load, and the page is even smaller on a mobile connection. Simples.
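The swap described above can be done entirely in CSS with a media query (class names, filenames, and the breakpoint below are illustrative): browsers generally only fetch the background image of the rule that actually applies, so phones never download the big file.

```css
/* Desktop: the large hero image (~200KB after optimisation) */
.hero {
  background-image: url("hero-large.jpg"); /* hypothetical filename */
  background-size: cover;
}

/* Narrow viewports: a much smaller image wins the cascade,
   so the large one is never requested on phones. */
@media (max-width: 600px) {
  .hero {
    background-image: url("hero-small.jpg"); /* hypothetical filename */
  }
}
```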


I am not a coder or an expert, but I have done a few sites, most of them non-profit. Nevertheless, they needed some of the nice/fancy features of the big, awesome-looking sites, be it a fixed nav with dropdowns and megamenus, or simply a responsive layout.
To achieve that I used Zurb Foundation and Bootstrap, and of course that could be considered contributing to the unnecessary bloat; the implementation of the EU cookie directive can be seen that way too, just to give a couple of examples.
I did try to strip some of the CSS from those frameworks, but it really didn’t seem to be worth it.

Now, though I believe every drop counts in the ocean, I would think the statistics above are heavily influenced by the big sites, be they news, social, or business.
Take a news site: now more than ever we want images and videos to support the news – hey, we have the technology – and all of that requires more code and media support.

Look at this same forum: doesn’t it weigh more than the older system?

It is the same as hardware/software development: e.g. 15 years ago I could run Photoshop on a PC with 256/512MB of RAM; now I need much more.

Like it or not, this is the global consumerist society, so the issue is mainly consumerism – but some would probably say I’m off-topic.

Check out this site – I didn’t build it. It looks nice, but it’s 5.5MB!! http://blueprintforwater.org.uk/
It seems to be mostly massive images. I think they got paid quite a lot to build it, too (we know the company that had it made).


An EXCELLENT example! The five background images in that carousel, plus the image further down, come in at a whopping 5.46MB! And to be sure I wasn’t completely wasting my time, I reduced the screen to check that the mobile experience was the same, and it was…

I took those six images and, even using a simple app like Paint.NET, re-saved them at 55% quality (I’m sure I could have gone lower) without sacrificing image quality, and the total file size dropped to 1.53MB. That’s a 72% saving – the files ended up at just 28% of their original weight – without any major effort! And considering my Photoshop Elements at home will do file saves like this in bulk, there’s even less of a reason to accept something like this.
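A quick sanity check on those numbers (sizes taken from the post above): going from 5.46MB to 1.53MB leaves about 28% of the original weight, i.e. roughly a 72% reduction.

```javascript
// Sanity check on the quoted figures (sizes from the post above).
const before = 5.46; // MB: the six images as served
const after = 1.53;  // MB: after re-saving at 55% JPEG quality
const reduction = ((before - after) / before) * 100; // percentage saved
const remaining = (after / before) * 100;            // percentage left
console.log(`${reduction.toFixed(0)}% smaller, ${remaining.toFixed(0)}% of the original weight remains`);
// → "72% smaller, 28% of the original weight remains"
```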

You’re right. This is pretty damn crazy.
Check out the amazon.ca home page: 7.68MB, with over 510 resources requested.

Wow.

Sorry to double post, but eBay.ca is about 2.94MB with 172 resources on the home-page landing…

TBH though, the latest browsers are free and easy to download/install/update. People spend loads of money to get the latest this, that, and the other. When it comes to browsers, I think people should just update their browser or download a modern one. It holds back innovation when they don’t. I understand that sometimes bad business decisions mean you can’t use a modern browser at work, but when big companies stop supporting old browsers, it encourages those companies to modernise.

Users want the most brilliant products and services, for free, and on outdated software. I think it is a bit much to ask.

Some browsers are platform-specific, such as Internet Explorer and Safari.