Why Page Weight Could Wreck Your SEO Efforts

I recently wrote the article “Why Page Weight Still Matters”. My concern is that the broadband revolution has caused some developers to forget efficiency and file optimization. An increasing number of sites have pages which approach several megabytes in size, yet some emerging markets only support low-bandwidth connections.

Have you addressed a page weight problem on your site? If you didn’t heed my warning, perhaps you’ll take notice of the bombshell Jen recently dropped: page weight now affects your Google search engine position!

Google now considers the time it takes your website to load. Fatter pages result in slower downloads — and site speed is a factor that affects your search ranking. According to the Google blog:

Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we’ve seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don’t just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that’s why we’ve decided to take site speed into account in our search rankings.

Take a deep breath and don’t panic. Google’s site speed algorithms have been running for several weeks. If you haven’t noticed a change in your search position, your site is probably fine. For now.

Site speed carries far less (mathematical) weight than other factors such as relevancy, topicality, and reputation. Google estimate that fewer than 1% of queries will change as a result of the new algorithms. However, a simple fact remains: your competitors may achieve a higher position if their pages are leaner than yours.

It may be a good time to rethink your boss’s request for another 15 adverts or a 100KB JavaScript library to handle roll-over effects. If your site has become a monolithic monster, perhaps you should consider the potential harm to your business. Buying a faster server or increased bandwidth is just putting off the issue — your pages need to go on a calorie-controlled diet.
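
As a rough illustration of the sort of trimming I mean, here’s a minimal sketch of a lean page; the file names are invented, and your own build process may differ:

    <!DOCTYPE html>
    <html>
    <head>
      <title>A leaner page</title>
      <!-- one combined, minified stylesheet instead of a dozen separate files -->
      <link rel="stylesheet" href="styles.min.css">
    </head>
    <body>
      <!-- explicit dimensions let the browser lay out the page before the image arrives -->
      <img src="photo-small.jpg" width="200" height="150" alt="Product photo">

      <!-- one combined, minified script, loaded last so it does not block rendering -->
      <script src="scripts.min.js"></script>
    </body>
    </html>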

See also: 9 Causes of Web Page Obesity.

Are Google right to consider page weight and site speed in their algorithms? Will it have an impact on your website? Are you taking steps to reduce your bandwidth requirements? Please leave your comments below or vote using our new poll on the SitePoint home page…

  • mahen23

    Heheh, all my sites are plain XHTML/CSS, so I do not worry

  • http://www.optimalworks.net/ Craig Buckler

    @mahen23
    So you’ve got no images, adverts, JavaScript libraries, etc? I’m sure you’ll be fine!

  • mahen23

    @Buckler Of course there are images, and some JavaScript, but they are very minimal. Bandwidth here in Mauritius is very low, so I need to compensate…

  • henrikblunck

    I fully agree with the intention to keep your pages as functional as possible in as little download time as possible. I see it as proper coding to leave all excessive fat off the pages and to ensure that all code validates. That will make them as user-friendly as possible for as many people as possible, and it need not take up any extra space.

    The issue of images has been mentioned, and here we should also note that images can be served at a lower resolution when they are just part of a webpage, compared with photos offered for download or print. Big company sites often show thumbnails, and the high-quality image is something you download by clicking that very thumbnail.

    So imagining that everyone sits in front of a 21″ high-resolution iMac, when many will need to view your website on an iPhone, says it all. Your thinking should always be focused on functional webpages in as little space as possible.
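
    Something along these lines sketches the thumbnail idea (the file names here are invented):

        <!-- small thumbnail inline; the full-resolution photo downloads only on click -->
        <a href="photo-full.jpg">
          <img src="photo-thumb.jpg" width="120" height="90" alt="Photo (click for full size)">
        </a>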

    Thanks for a good blog. :-)

  • Peter

    I don’t usually bash Google but I am now considering starting to do so.
    They can refer to usability testing all they want, but if this measure leads to relevant-but-heavy pages getting a lower rank than slim-but-irrelevant pages (where content is secondary to experience), then Google is setting off down the entertainment road rather than staying on the information road.

    AFAIK, so far it has all been about trying to rate content in various ways and then combining those ratings into a single score. That was good, but rating speed is bad. Speed is already implicitly included in the Google score: if users are unhappy with the speed of a site, they are less likely to link to it; if they rate the content highly, they will link to slow sites as well, and that will be reflected in the Google score, and accurately so.

    Use a parallel rating system instead, because that allows speed and content to co-exist as parameters for choosing among search results. It’s almost as if Google think they should make the choice for us instead of giving us the possibilities. I mean, seriously, who uses the “I’m Feeling Lucky” button as anything other than a gadget? Now they want to force that button on us?

  • 2MHost.com

    So what’s the best online speed test and website analysis tool? I believe Yahoo has one, but I can’t find it now!?

  • NetNerd85

    Page weight? They have gone insane.

  • http://01-global.net 01globalnet

    As I read this article, there is no real “evidence” that Google calculates page weight for rankings.

    But if they do, do they have the right algorithms in place? How is a page’s loading speed measured? Does it depend on the contents (robots do not load CSS, JS, etc…) or on the web server? What if, for a specific period, my web server is slower to respond because of a Digg effect on another website on the same server??

  • topdown

    For those asking for tools:
    there is Yahoo’s YSlow for Firebug: http://developer.yahoo.com/yslow/
    and I use Google’s Page Speed plug-in for Firebug:
    http://code.google.com/speed/page-speed/

    Also, the Net tab in Firebug can give you insight into how many HTTP requests your page makes, which plays a big part in page speed.
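
    To sketch what trimming requests can look like (the file names are hypothetical):

        <!-- before: five HTTP requests for style and script -->
        <link rel="stylesheet" href="reset.css">
        <link rel="stylesheet" href="layout.css">
        <link rel="stylesheet" href="theme.css">
        <script src="menu.js"></script>
        <script src="rollover.js"></script>

        <!-- after: the same content combined into two requests -->
        <link rel="stylesheet" href="site.min.css">
        <script src="site.min.js"></script>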

  • Anonymous

    Google has essentially struck at the Achilles heel of designers. This new ranking system will limit design substantially. Another hard hit for Flash designers.

  • http://www.skylight-studios.co.uk tdsmithj

    I think it might only consider the actual amount of code shown on the page and not the size of the elements in the code?

    For example, Google might read an image tag in HTML, not the actual image itself. Plus, taking speed into consideration could involve a number of factors besides the website itself, such as peak times, web server load, etc. I assume Google takes a number of readings over a specific period and bases its decision on a percentage before determining whether to take action.

  • http://www.lunadesign.org awasson

    Anonymous: Google has essentially struck at the Achilles heel of designers. This new ranking system will limit design substantially. Another hard hit for Flash designers.

    Nonsense. It’s about time some serious thought was given to the amount of bloat in web pages. Go look at CSS Zen Garden and see what amazing design can be accomplished through creative use of CSS and a few KB of content/images.

    And Flash… Flash has always been limited from an SEO perspective because the working FLA file is compiled into an SWF, which provides no access to the content within. That said, there’s no reason why you can’t create a lean multi-SWF site where the homepage SWF is very light and you load new SWFs whenever a navigation item is selected. It will penalize those sites that take a minute and a half to load because the developer didn’t split a large site into smaller bites (bytes).

    I guess this also changes the table vs CSS debate…
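
    The same load-on-demand idea in plain HTML/JavaScript terms, as a rough sketch (the file name, function name and IDs are invented):

        <a href="gallery.html" onclick="return loadSection('gallery.html');">Gallery</a>
        <div id="content"></div>
        <script>
        // fetch a section only when it is requested, instead of
        // shipping the whole site in one heavy initial download
        function loadSection(url) {
          var xhr = new XMLHttpRequest();
          xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
              document.getElementById('content').innerHTML = xhr.responseText;
            }
          };
          xhr.open('GET', url, true);
          xhr.send(null);
          return false; // cancel the full-page navigation
        }
        </script>

    The href still points at a real page, so visitors without JavaScript (and search robots) can reach the content the slow-but-sure way.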

  • http://www.optimalworks.net/ Craig Buckler

    @awasson
    Agreed. The only ‘problems’ caused will be for developers who have never considered page weight before.

    I certainly don’t think it imposes unreasonable limits. Some web pages have reached multiple megabytes, but it’s rarely because they are especially ‘creative’. It’s a slightly ridiculous situation when you consider that a browser download isn’t much bigger.

  • Peter

    I am surprised everyone is talking about the design aspects only. Look at the forest beyond the trees.

    Am I the only one who is concerned that Google, whose success has been based upon finding the content we look for (more or less), is now going to try to tell us which content is better – based not upon a rating of the content (always tentative, of course, yet historically acceptable) but also upon how the content is accessed?

    Sheer poppycock, I say!

    First the Kindle/iPad/etc. foreboding of an internet age where content gets locked into hardware, and thus where what we can access depends on our hardware. Now the mega-indexer wants to start sorting search results based upon a factor as irrelevant (in relation to information) as speed?

    Don’t get me wrong, speed is important, but if a slick site wins over an informative site, that should be up to the users, not the search engines. Google has finally gone megalomanic.

  • http://www.sassquad.com sa_scott

    This isn’t really a big deal – but it is a wake-up call to anyone who has taken for granted that broadband speed hides the bloat of websites. Time and time again, I come across sites with outrageously large graphics files. It’s really not hard to spend a little time in Fireworks or ImageReady optimising your graphics!

    The recent news of the Indian government’s plan to introduce broadband to its entire population within the next 2 years is pretty ambitious. It also highlights how much of the world does not have a fast connection. I saw for myself, on a dial-up connection in Malaysia five years ago, how long it took to download bloated sites.

    I commend Google for introducing this feature. It’s time to fight that flab!

  • http://www.optimalworks.net/ Craig Buckler

    @Peter
    I don’t think it’s quite as bad as you describe. Site speed has a far lower influence than relevancy. What it does mean is that if you have two sites with similar content and inbound links, the lighter-weight one is likely to be ranked higher. Smaller weight = faster download = more ‘useful’ to the user.

  • adam.hopkinson

    I expect Google are using data from the Google Toolbar to measure site speed – they have been using this in Webmaster Tools (under Site Performance) for a while, and it is the only method that accurately measures end-to-end speed, from the initial browser request right through to the completion of images, footer JavaScript and so on.
    Yes, this would mean that users’ connection speed is a factor in site speed, but that is exactly what is being measured. Given like-for-like content, when I search for a term from my portable device – or on a particularly slow fixed connection – I’m most interested in the site that loads the fastest.
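
    You can approximate that end-to-end measurement in the page itself; a crude sketch (it misses everything that happens before the HTML starts arriving, which the toolbar can see):

        <script>
        // note the time as early in the document as possible
        var pageStart = new Date().getTime();

        // window.onload fires once images, scripts, etc. have finished,
        // so the difference roughly approximates the full load time
        window.onload = function () {
          var elapsed = new Date().getTime() - pageStart;
          alert('Loaded in ' + elapsed + 'ms');
        };
        </script>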

  • http://www.rwtconsultants.com israelisassi

    I have to wonder if this decision was somehow affected by Google getting into the ISP business…

  • http://www.austin-it-consulting.com austince

    It would make sense for the overall design and usability of your site if it loads faster. I remember from other SitePoint forums that if the user has to wait more than 4 seconds, they typically click through to something else. Plus, your Google Analytics report tells you what your users’ bandwidth limitations are, so this news should not be a shock to the system.

    The Yahoo Developer Network has a good breakdown of how to speed up your site:
    http://developer.yahoo.com/performance/rules.html

  • Rich

    @2MHost.com: There’s a Speed Analyzer built into Firefox’s Firebug add-on. I think you can access info in Webmaster Tools | Labs too.

    The fact that Google have included these metrics and measuring tools does suggest it is going to have an impact on ranking. However, designers should be ensuring pages load quickly for the visitors’ benefit anyway, shouldn’t they?

  • Man Ray

    It is true that page weight doesn’t affect the rank of a site that much, but that doesn’t mean you should forget that it still counts as a factor. As an SEO reseller, I’ve always advised my clients that whatever factor affects Google’s ranking decisions, you must not ignore it. No one actually knows the algorithm that Google uses to compute rankings, but that doesn’t mean you shouldn’t work on getting all the known factors right. You should religiously consider every possibility to get ahead while staying on the right side of legality.

  • http://www.virvo.com Web-Development

    I don’t understand why everyone is making such a big fuss about page weight. If it’s only going to cause a 0.05% change in rankings… why does everyone think it’s a big deal? I’m not saying it doesn’t matter at all, I just don’t understand all the fuss.

  • http://www.optimalworks.net/ Craig Buckler

    @Web-Development
    Where did you find the 0.05% change in rankings figure? No one knows exactly how sites will be affected, but Google state that fewer than 1% of queries will change as a result. That’s still a huge number of sites.

    And how will it change the result? A move from #500 to #1,000 may not be noticeable, but a move from #1 to #2 could have a massive impact on sales.

    Many SEO techniques are difficult to quantify. The good thing about site speed is that it’s understandable and benefits everyone. Reducing page weight is great for your users, your hosting costs, and (possibly) your page rank.

  • http://www.austin-it-consulting.com austince

    I would be interested in where you got the 0.05% stat also. I do agree that there is no magic bullet to the SEO process, but some basic principles should be followed:

    1) Good coding
    2) Proper meta tags
    3) Page weight
    4) Good content

    After this, you have a solid foundation to build off of.

  • Adam

    Yes – I agree with this. There was something recently on another SEO forum I follow about Google pushing certain results down the rankings for slow response times. I think their thought was: a slowly responding webpage is a poor user experience. They want to find the most relevant results (which is what we all focus on), but they also want a positive experience for their end-users so they come back and use the search engine again… We found this compelling enough that we shot something out to our SEO reseller mailing list (http://hubshout.com) saying: Hey, keep your server running smoothly. This is another good article along those lines. We may reference it in a future blast.