Confirmed: Google Uses Site Speed in SERPs

We’ve all seen this coming for some time and Google has finally confirmed it: site speed will be used to rank your website. Don’t get too excited though; site speed affects less than 1% of search results:

While site speed is a new signal, it doesn’t carry as much weight as the relevance of a page. Currently, fewer than 1% of search queries are affected by the site speed signal in our implementation and the signal for site speed only applies for visitors searching in English on Google.com at this point.

Well… I find it great that you can rank higher in search results by optimizing a few dozen parameters. Sometimes you aren’t able to optimize one or more of them (it takes money or time), but if you can make the site better in other ways and that adds to your ranking – that’s nice!

This is more of a scripting/database issue and less of a big-images or big-Flash-files issue… but I’m guessing that’s the next step.

The stupidity of Google never ceases to amaze me. First they wrecked the significance of incoming links with “nofollow”, then they tanked websites for no reason (the Sandbox), and now they are going to punish webmasters on shared servers because their sites load a little slower than the big boys with big money to spend on dedicated servers?

It seems that many of the commenters on the article don’t think this is a good idea.

“All things being equal I would prefer to visit a fast loading site versus a slow one when I search for something.”

Well, I would prefer to find a site that has the information I’m looking for. I would not mind waiting a bit longer to retrieve that information. What good is a fast loading time if the site is offering useless information?

This will only lead to attacks on popular sites to slow them to a crawl, and to spammers using fast static sites to serve their content.

Really? Too bad the “geniuses” at Google couldn’t figure that one out.

I’d say the huge army of “SEO” blog comment spammers is to blame for that one :)

You’re looking at this in black and white, whereas Google is evaluating websites on a sliding scale.

  1. The significance of incoming links was not destroyed with the introduction of “nofollow”; rather, it was simply made easier to control. As bluedreamer said, spammers abused the idea of incoming links, so Google created a way for honest webmasters to combat them. If someone links to your website and places a “nofollow” tag on the link, they probably didn’t want to pass any link juice your way in the first place. Think about it this way: a link without rel=“nofollow” passes link juice, which is the default; Google made webmasters purposely add something to the link in order to negate PageRank (see the sketch after this list).

  2. We (outside of Google) don’t know what the load time benchmarks are for being punished or rewarded. It could be anything. That being the case, how can you say that websites which load “a little slower than the big boys” will be punished? You don’t know that, and you haven’t defined how much slower “a little” is. What Google is saying is that speed is important, so don’t neglect it. While speed may play a factor in ranking (though not always), I have a hard time believing that Google is going to use page load time to bury a website that serves up unique, quality content.
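To make the “nofollow” point from item 1 concrete, here is a minimal Python sketch of how a link parser could treat links as passing PageRank by default and skip only the ones that explicitly opt out with rel=“nofollow”. It is just an illustration of the convention, not Google’s actual crawler logic:

```python
# Sketch only: extract links from an HTML snippet and show which ones would
# pass "link juice" under the convention described above -- links count by
# default, and only an explicit rel="nofollow" opts out.
from html.parser import HTMLParser


class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, passes_link_juice) tuples

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        href = attrs.get("href")
        if href is None:
            return
        rel_values = (attrs.get("rel") or "").lower().split()
        self.links.append((href, "nofollow" not in rel_values))


sample = """
<a href="http://example.com/normal">normal link</a>
<a href="http://example.com/spam" rel="nofollow">comment-spam link</a>
"""

parser = LinkExtractor()
parser.feed(sample)
for href, follows in parser.links:
    print(href, "-> passes PageRank" if follows else "-> nofollow, ignored")
```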

The lesson here is that if your website is actually useful then Google is still going to display it. However, if there’s a nearly identical website out there that loads faster, then that faster website will probably rank above yours. The question must be asked, though: if that’s the case, why are there identical websites? It would serve no purpose to the user to have two identical websites, and at the end of the day Google is there to serve the user.

So, cheesedude, quit complaining about the fact that Google has introduced another reasonable factor into their already successful algorithm and use the knowledge to your benefit. If your website loads within a reasonable amount of time but still not quite as fast as “the big boys”, you’re still going to be okay – provided, that is, that your website actually offers relevant content that the “big boys’” websites don’t.

Looks like Google is looking for around a half-second load time for ecommerce websites.

It was only a matter of time before Google started using a site’s load speed in the SERPs. I mean, if people don’t like slow websites, why should Google like them? :)

And a lot of webmasters put “nofollow” on links they put on their own sites. I do. I cannot be the only one.

Anyone else put nofollow on outgoing links they place on their own sites?

The speed at which a website loads is 100% irrelevant to the quality of the information it contains. With 4.5 years of experience on shared servers, I can tell you that any server can have occasional slow-downs. I used to be on an overselling host, and at peak times forum page loads were taking 30 seconds. Naturally, I switched to a better web host. But webmasters should not be penalized for things beyond their control. I have looked at the page load data in Google Webmaster Tools, and while most page loads are 2–3 seconds, there have been times when they took much longer. It doesn’t change the content I provide.
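As an aside, if you want to sanity-check the load times Webmaster Tools reports, a rough measurement is easy to take yourself. The Python sketch below (standard library only, with a placeholder URL) times a single HTML fetch; it ignores images, CSS, JavaScript and rendering, so treat it as a rough lower bound rather than a real page-speed measurement:

```python
# Rough sketch: time how long a single page fetch takes, as a sanity check
# alongside the numbers Webmaster Tools reports.
import time
import urllib.request

URL = "http://example.com/"  # placeholder, not a real site from this thread

start = time.perf_counter()
with urllib.request.urlopen(URL, timeout=30) as response:
    body = response.read()
elapsed = time.perf_counter() - start

print(f"Fetched {len(body)} bytes in {elapsed:.2f} s")
```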

It’s just a bad idea. I think Google has other motives. I would not be surprised if they start buying web hosting companies to cement more control over the web.

The commenters on the Google blog article feel otherwise. So do I.

You were the one who originally shared the link with everyone here at SitePoint. They are using it for some purpose.

Possibly. But maybe not as high. I am sure you know full well that landing on the first page of the SERPs is going to bring you much more traffic than being on the second or third.

Great. My sites get hit by scrapers from time to time. A big money webmaster scrapes my site, steals my content, changes it just enough to make it not exactly the same, puts it on a faster server, then he outranks me? Is that fair?

Do you define identical as being the exact same word for word or containing the same information? Would you want to live in a world limited to only one website about toenail fungus?

That’s what the duplicate content filter is supposed to be about.

Successful? There aren’t a whole lot of search engines out there.

Speed is irrelevant.

Actually, Google is setting the target for e-commerce sites at 2 seconds. Google’s own target for its search engine is a half-second.

Otherwise, studies by Akamai claim 2 seconds as the threshold for ecommerce site “acceptability.” Just as an FYI, at Google we aim for under a half-second.

I think 2 seconds is insane as a target for an e-commerce site. I have a simple front page with no graphics: 3 simple MySQL queries to load the latest topics, 45 KB of compressed text (about 200 KB uncompressed), and a 10 KB CSS file. From DNS lookup to content download it took 1.3 and 1.5 seconds on two tests, on a shared server with 2 CPUs and a load average of under 0.50 at test time. That is as fast as it is going to get, because that is a low server CPU load. The only way I could make it faster would be to rewrite the entire site to cache every single page and regenerate each page when its content changes.
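For what it’s worth, the “cache every page and regenerate it when the content changes” approach can be sketched fairly simply. The Python below is only an illustration of the idea, with a hypothetical render_page() standing in for the MySQL queries and templating; it isn’t taken from any real forum software:

```python
# Minimal sketch of a file-based full-page cache: serve a stored copy when
# one exists, otherwise render the page (the expensive part) and store it.
import hashlib
import os

CACHE_DIR = "page_cache"
os.makedirs(CACHE_DIR, exist_ok=True)


def cache_path(url: str) -> str:
    return os.path.join(CACHE_DIR, hashlib.sha1(url.encode()).hexdigest() + ".html")


def render_page(url: str) -> str:
    # Hypothetical placeholder for the expensive work: DB queries + templating.
    return f"<html><body>Rendered content for {url}</body></html>"


def get_page(url: str) -> str:
    path = cache_path(url)
    if os.path.exists(path):                      # cache hit: skip the DB work
        with open(path, encoding="utf-8") as f:
            return f.read()
    html = render_page(url)                       # cache miss: render and store
    with open(path, "w", encoding="utf-8") as f:
        f.write(html)
    return html


def invalidate(url: str) -> None:
    # Call whenever the underlying content changes (e.g. a new post).
    path = cache_path(url)
    if os.path.exists(path):
        os.remove(path)
```

The trade-off is exactly the one described above: every content change has to invalidate or regenerate the affected pages, which means restructuring how the site serves requests rather than making a quick tweak.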

It isn’t Google’s business how fast websites load and it is irrelevant to the quality of the sites.

I am just interested in how this speed factor in ranking will be used for websites that are not submitted to Google Webmaster Tools.

I guess for such websites this factor will not be applied. The indirect implication is that owners of very heavy websites might want to abandon their Webmaster Tools accounts. Bye, Google.

Why would it matter whether a website is attached to Google Webmaster Tools? To the best of my knowledge, that is not a factor in any ranking criteria, including site speed.