So, why haven’t I blogged it to death myself? Because it tells us nothing about what Google is doing today, and very little about what they might do in the future. Read the SEOmoz discussion and the patent, and you’ll see that the patent covers just about every possible way that historical data might be collected and used.
If Google decides to reward new sites which acquire a lot of backlinks in a hurry, because great new resources tend to get a lot of links in a hurry, that would be covered under this patent. If Google decides to penalize new sites which acquire a lot of backlinks in a hurry, because that’s also what link spammers do, that would be covered under this patent.
So it goes with just about any historical data they might look at; the problem is discerning the “good guy” cases from the “bad guy” cases. Even though Google’s decision makers may have no idea what they actually want to do, they file a patent anyway, since that’s what you do when you pay hundreds of people to invent stuff for you.
One of these days, if I really feel like making some old friends into enemies, I’ll take on some of the utter nonsense that’s bouncing around about the “Google Sandbox.”
It seems clear enough that new sites aren’t getting the same trust they used to get on Google, transient links (link rentals, RSS, press releases) aren’t having the same impact they used to, and excessive use of the same anchor text in incoming links (the usual pattern for “text ads” aka link rentals) can have a negative impact on a page’s rankings.
So far, that’s all I can see going on. Are some sites experiencing long-term penalties for overdoing it with link spam? Sure they are – but that’s not news, it’s how Google has operated since day one.
Are more sites experiencing this now, as compared to a couple years ago? Sure, because more site owners have fallen for the hype and temptation of link rentals, and Google has responded by getting more aggressive about filtering. More people are complaining about penalties, because more people are spamming, and more innocent folks are getting caught in filters intended to catch spammers.
When I hear about a site that’s ranked #1 on MSN and Yahoo, and used to be #1 (but is now nowhere to be found) on Google, I don’t need to squint over a patent to know that they got caught with their hand in Larry Page’s cookie jar. So far, there are a lot of “theories” about how to recover from something like this, but I remain skeptical that any of them will be effective. Google has no reason to restore a link spammer to good graces, and a lot of technical obstacles in the way even if they wanted to.
So, having said all of this, someone will still ask me “what should I do to avoid getting sandboxed?” How about building links naturally, by continuously seeking links to your web pages from other web pages that already link to pages like yours? Put another way, keep working on quality links from relevant sites that reach your target audience – it’s a worthwhile activity, and it will pay off for you even if the major search engines all switch to 100% pay-per-click tomorrow.
For a minute, just forget about Google, forget about patents, forget about reverse engineering. Look for good sites that link to sites like yours. Give them a reason to link to your site. Repeat. Repeat. Repeat.