SEO Disasters: What Happens When Google De-Indexes Your Site

By Kerry Butters

It’s been a tough year or so for many SEO companies – particularly those that bend the rules with black-hat techniques, especially where links are concerned. I’ve always been of the opinion that quality wins out: you can tweak for SEO, but chasing links through poor content is always going to raise Google eyebrows or, worse, earn a manual penalty.

The major algorithm changes that the search engine has made in recent years around content have meant that many companies have seen their sites all but disappear from the index. Why? Because they haven’t followed the rules, either deliberately or accidentally.

Panda, Penguin and Hummingbird, Google’s best-known recent algorithm updates, have changed the game. They demand high quality in both site architecture and content. What’s more, if Google sniffs out purchased links, a manual penalty, or in the worst cases de-indexing, becomes a very real possibility. Sometimes this is the fault of an agency that takes a less-than-ethical approach to SEO; in other cases it’s down to a site owner’s lack of knowledge, or simple mistakes.

White-hat SEO is the only approach to take unless you want to risk a penalty. That doesn’t just mean careful link building; other techniques that can attract penalties include:

  • Cloaking/Gateway pages
  • Spun content
  • Purchasing links
  • Duplicate content
  • Keyword stuffing

Many webmasters still use cloaking and gateway pages as a means to climb the SERPs, but it’s really not advisable. Don’t just take my word for it: Google’s Matt Cutts has warned against cloaking more than once in his Webmaster videos.

In January 2014, Cutts also called time on guest posting as a means of building backlinks for SEO:

Okay, I’m calling it: if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Why? Because over time it’s become a more and more spammy practice, and if you’re doing a lot of guest blogging then you’re hanging out with really bad company.

So while guest posting can still be used to promote content and for public relations, it can no longer be used solely to build links. The spammers ruined it for everybody.

If you get a penalty

According to Google, you could be at risk of a penalty if your site:

[D]oes not meet the quality standards necessary to assign accurate PageRank. [Google] cannot comment on the individual reasons your page was removed. However, certain actions such as cloaking, writing text in such a way that it can be seen by search engines but not by users, or setting up pages/links with the sole purpose of fooling search engines may result in permanent removal from [the] index.

A manual penalty will show up in your Webmaster Tools account. Depending on the cause, you can clean up and re-submit the site to Google. This isn’t necessarily an easy job, though, and it’s important to get it right. Given that shady links are one of the most common reasons for a penalty, let’s look at how you can approach the clean-up.

First of all, Google recommends that you use Lynx, a text-based browser that lets you view a site in much the same way a search bot does. Do bear in mind that you won’t see content that depends on scripts, session IDs, Flash and so on (just as a search bot can’t).
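If you’d rather script that check than open Lynx by hand, here’s a minimal Python sketch that shells out to Lynx and prints the text-only view of a page. It assumes Lynx is installed and on your PATH, and the URL is just a placeholder:

```python
import subprocess

# Dump a page the way a text-only client sees it (no scripts, Flash or session state).
# Assumes the Lynx browser is installed and available on the PATH.
url = "http://www.example.com/"
result = subprocess.run(
    ["lynx", "-dump", url],
    capture_output=True, text=True, check=True
)

# Review this output for anything important that only appears via scripts.
print(result.stdout)
```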

Another option is to use [Open Site Explorer](http://www.opensiteexplorer.org/) to examine links. This free tool (part of the [Moz](http://moz.com/) suite) lets you export your inbound links as a .CSV file, which is useful in order to examine your links and begin to address the problem.

Getting organised

Before you begin, it pays to get yourself organised by setting up a spreadsheet containing all of your link information. This gives you a record of the changes you’ve made, which is helpful when submitting a reconsideration request. (If you’d rather script the starting point, there’s a short sketch after the list below.)

Your spreadsheet should include:

  • Links to and from URLs
  • Anchor text details
  • Contact details for the linked from sites
  • Removal/nofollow requests sent to sites including dates and number of requests made
  • Link status (removed/nofollow/live)
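As promised above, here’s a rough Python sketch for starting that tracking spreadsheet from an exported list of inbound links. The input file name and the "URL"/"Anchor Text" column headings are assumptions, so match them to whatever your Open Site Explorer or Webmaster Tools download actually contains:

```python
import csv

# Start the link-tracking spreadsheet from an exported list of inbound links.
# "URL" and "Anchor Text" are assumed column names; check them against the
# headers in your actual export before running this.
with open("inbound_links.csv", newline="", encoding="utf-8") as src, \
     open("link_tracking.csv", "w", newline="", encoding="utf-8") as dst:
    writer = csv.writer(dst)
    writer.writerow(["Linking URL", "Target URL", "Anchor text",
                     "Contact details", "Requests sent (dates)", "Status"])
    for row in csv.DictReader(src):
        # Contact details, request dates and status get filled in as you work.
        writer.writerow([row.get("URL", ""), row.get("Target URL", ""),
                         row.get("Anchor Text", ""), "", "", "live"])
```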

Removing bad links

Before you can remove ‘bad links’, you first have to determine which ones are doing the damage. Of the many bad-link tools out there, the SEOgadgets tool has the best reputation. It lets you upload up to 200 links at a time, saved as an Excel/CSV file: either the export you made with Open Site Explorer, as mentioned above, or links downloaded directly from Webmaster Tools.

The tool uses its own algorithms to decide which links are safe and which aren’t. It takes its link data from SEOmoz, a trusted source, and scores each link. It also attempts to find contact info for each link and gathers other details such as anchor text, social media signals, Authorship, and link metrics such as Google PageRank and SEOmoz domain authority. Best of all, as well as coming from a trusted company, SEOgadgets is free, unlike many of the other tools out there, some of which charge $10-20 per link: that’s a lot of cash if you have a lot of links to work with.


On the next screen, you can wait whilst the tool analyses each link, or you can enter your email address and ask to be notified when the report has been generated.

You can then export all of the information from the tool and act on the bad links by contacting each site owner or webmaster and asking for the link to be nofollowed or removed completely. Do take care, though: when I tested the tool, it flagged some links as bad that I know are not, so check sites out manually too.
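To keep that outreach manageable, it can help to group the flagged links by domain so each webmaster gets a single request covering all of their URLs. Here’s a rough sketch; the file name and the "URL"/"Classification" column headings are placeholders rather than the tool’s actual export format:

```python
import csv
from collections import defaultdict
from urllib.parse import urlparse

# Group flagged links by linking domain so each webmaster receives one request
# covering all of their URLs. Column names are assumptions; match them to
# whatever the exported report actually contains.
links_by_domain = defaultdict(list)

with open("link_report.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if row.get("Classification", "").strip().lower() in ("bad", "toxic"):
            links_by_domain[urlparse(row["URL"]).netloc].append(row["URL"])

for domain, urls in sorted(links_by_domain.items()):
    print(domain, "-", len(urls), "link(s) to request removal or nofollow for")
```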

Finally, you can also use the Google Disavow tool if you really have to, but unless you’re very conversant with SEO, it’s not recommended. The tool is there for links that you have no way of getting cleaned up and that you’re sure are having a detrimental effect on the site’s ranking. These must be submitted to Google as a .txt file containing one link per line.

The file must be UTF-8 or 7-bit ASCII encoded, and you can add notes explaining why a link is being disavowed by starting a line with the # symbol, which marks it as a comment rather than a link. Using the tool instructs Google to ignore the bad links you can’t get removed, but don’t expect instant results; it can take a while to have an effect.
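To make that format concrete, here’s a small sketch that builds a disavow file from the links in your tracking spreadsheet that are still live after your removal requests. The "link_tracking.csv" file and its "Linking URL"/"Status" columns are the hypothetical layout described earlier, not anything Google requires:

```python
import csv

# Write a disavow file: plain text, UTF-8, one URL per line, with lines
# starting with '#' treated as comments rather than links.
# "Linking URL" and "Status" are assumed columns from the tracking spreadsheet.
with open("link_tracking.csv", newline="", encoding="utf-8") as src, \
     open("disavow.txt", "w", encoding="utf-8") as out:
    out.write("# Links we asked to have removed or nofollowed, with no result\n")
    for row in csv.DictReader(src):
        if row.get("Status", "").strip().lower() == "live":
            out.write(row["Linking URL"].strip() + "\n")
```

The resulting disavow.txt is what you upload through the Disavow tool itself.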

Before resubmission

Before you appeal to Google and look at resubmitting the site, you should also take a look at the following:

  • Site architecture
  • The use of keywords throughout the content
  • What’s included in meta descriptions and titles
  • That the meta keywords tag isn’t stuffed with keywords

Site architecture should follow a logical pattern and include internal linking. Keyword density throughout the site should be measured and, if necessary, lowered so that the copy reads naturally. Keywords remain a well-loved SEO tactic, but think in terms of one or two key phrases rather than repeating a single word over and over again. Phrases that are similar and relevant to each other also work well.
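If you want a number rather than a gut feeling, keyword density is simply how often a phrase appears divided by the total word count. A quick sketch, with a deliberately exaggerated sample:

```python
import re

def keyword_density(text, phrase):
    """Return the phrase's share of the page's word count, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    target = phrase.lower().split()
    hits = sum(words[i:i + len(target)] == target
               for i in range(len(words) - len(target) + 1))
    return 100.0 * hits * len(target) / len(words) if words else 0.0

page_text = "Cheap widgets. Buy cheap widgets here, the best cheap widgets online."
print(keyword_density(page_text, "cheap widgets"))  # far too high to read naturally
```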

Meta descriptions and page titles should all reflect the content of the page and be different for each page on the site.
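One way to spot duplicates is to pull the title and meta description out of each page in a local copy of the site and flag any repeats. Here’s a minimal sketch using only the Python standard library; the "site_export/" folder is an assumption, so point it at wherever your pages actually live:

```python
import glob
from collections import defaultdict
from html.parser import HTMLParser

class HeadInfo(HTMLParser):
    """Collect the <title> text and meta description of a page."""
    def __init__(self):
        super().__init__()
        self.title, self.description, self._in_title = "", "", False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

pages_by_title = defaultdict(list)
pages_by_description = defaultdict(list)
for path in glob.glob("site_export/**/*.html", recursive=True):  # assumed location
    parser = HeadInfo()
    with open(path, encoding="utf-8", errors="ignore") as f:
        parser.feed(f.read())
    pages_by_title[parser.title.strip()].append(path)
    pages_by_description[parser.description.strip()].append(path)

for label, groups in (("title", pages_by_title), ("description", pages_by_description)):
    for value, paths in groups.items():
        if value and len(paths) > 1:
            print("Duplicate", label, repr(value), "on:", ", ".join(paths))
```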

Links are not the only things that may have got you a manual penalty. You should also look at:

  • potential malware attacks/injections
  • poorly written or ‘thin’ content that offers nothing of value
  • hidden text (as in meta info/cloaked pages/black text on black background etc.)
  • user-generated spam in comments or community areas of the site
  • ‘spammy freehosts’

Free hosting services are unfortunately often associated with spam, as they tend to attract low-quality sites.

The tricky part

OK, so you’ve done all of the link-cleaning you can, you’ve checked the site over with a fine-tooth comb and moved hosts if necessary. Now it’s time to ask Google nicely if it will reconsider indexing your site.

You can do this directly from the Manual Actions page in Webmaster Tools. In your reconsideration request, you should include as much information as possible in order to, as Matt Cutts puts it, offer:

clear, compelling evidence [to make it] easier for Google to make an assessment

If unnatural links caused the penalty, make your link-tracking spreadsheet available as a Google Doc and share it within your request.

You should also explain how you wound up with unnatural links. If you used an agency, inform Google. If a member of your staff put the links in place, tell Google what you’ve done to ensure the mistake won’t be repeated. It’s your job to convince the search engine’s staff that this will never happen again, and the more supporting information you can give, the better. Leave no stone unturned and make the case that your site, and its administrators, can be trusted not to repeat the problem.

If you’ve been thorough, looked at all unnatural links and cleaned up every aspect of the site, then there’s no reason Google should refuse. If they do, you can appeal, but make absolutely sure that the site is clean first.

It’s not pleasant getting a penalty from Google and if your site has a lot of backlinks, it can be a laborious and drawn-out process. However, it’s got to be done if you want to maintain a web presence, especially if you’re in business.

The best way to avoid a manual penalty is to do everything by the rules, keep a keen eye on the sites that link to your content and forget chasing links as a form of SEO. Instead, build relationships with editors and sites in your niche if you want to write for other sites and have them link back to yours. Use a bio, sign up for Google Authorship and make sure that you offer quality content. Don’t insist upon a link either, or it will be clear that a link is all you’re really after.

SEO is getting tougher now and this is a positive step as far as I’m concerned. It makes for a better web for all of us, one that’s more useful and where high quality work can rise to the top.

  • Packards King

    Thanks for the post Kerry… this article is like a complete guide to SEO.
    Does social bookmarking and directory listing cause any harm to my site? Do you think bookmarking my site on certain poor sites might have a negative effect?

  • http://markitwrite.com/ Kerry Butters

    Thanks for reading :) I think that poor-quality sites should be avoided, but social signals are becoming increasingly important, so unless you spam it should be fine. It’s the same for directory listings: especially if you’re a local business, ensure that citations etc. are correct and keep everything as natural as possible. By this I mean make sure that any directories and social bookmarking are highly relevant to the industry you’re in, and keep it that way; don’t just list wherever you can. Some people believe directory listings to be dead, but done properly they are still useful, especially for local SEO. Just think white hat at all times.

    • Packards King

      Thanks for the reply..

      Well, I don’t spam of course… I just bookmark the links as I post new articles on my site…
      But it seems I should be more careful about directory listings from now on…

  • Sarah Roscoe

    Excellent article, thanks.

    • http://markitwrite.com/ Kerry Butters

      Pleasure, thank you for reading :)

  • http://www.rankgiant.com/ www.rankgiant.com

    This should encourage marketers to take a look at the company they keep
    and consider whether those associations and partnerships are purely

    • http://markitwrite.com/ Kerry Butters

      Yes I agree, I think that especially in light of the guest blogging situation, marketers and SEOs have to be more vigilant than ever.

  • http://markitwrite.com/ Kerry Butters

    I wasn’t of course suggesting for a second that you do spam :) Yes, I think care is necessary these days, Google’s making it a rocky path aren’t they?!

  • Jonty Rhodes

    It was a great article, thanks for sharing it.

  • Dingo

    Good article… but:

    “Do take care, as when I tested the tool I did find that it pointed to links as being bad that I know are not, so check out sites manually too.”

    What’s the point of it, then? If you automatically know which are good and which are bad, do you really need the tool?

  • amit d

    Excellent Article, well done.

  • TmWe

    “What’s more, one sniff of purchased links and it’s highly likely that a site will be de-indexed.”

    Can you give some clear and authoritative examples of this.

  • Damien James

    Algorithms change too often; I got hit with a pure spam notice in the past few days. All my content is hand-written, up to 2,000 words per page. It’s not just gibberish, it’s based on facts and information. There’s no cloning or copying. I tested with Copyscape; I have one page that hits 6%, but that’s because the page it links to just spammed my keyword all over his page. I have a little over 200 clean backlinks that point to niche-related sites. Maybe the odd few are borderline links, but the odd few can’t cause this kind of hit. I don’t operate in any spammy manner. I think it was down to a change of my home page. There really is nothing to clean up, so I’m unsure as to why I got hit. Funny thing is, there are clearly pure spam sites out there that have been around longer than I have and are still going strong. It’s a pain in the ass, even though your blog was page 1, positions 2-3, for 6 months and then bam, de-indexed. I wish they were more informative instead of giving a vague explanation. Yes, it could be 1 of 100 reasons. Cheers, Google.

  • Asik

    Stick to the rules! That’s the point the author is making. It’s pretty understandable: rules are everywhere, and Google’s search engine isn’t an exception. What I’m a bit upset about is the issue of re-indexation. If I’m convincing in my request and the website has been checked a hundred times, why would Google refuse? That’s the question. I’m trying to look for info on deindex.pro.


