Google Analytics - sudden loss of links?

Hi all,

Somewhat puzzled what to think.

I logged in to my Webmaster Tools today via Google Analytics, as I normally do every couple of days just to check up on things, and saw that Google suddenly reports my site has only 35 found links.

It’s been at 1,585 for months now, and then this.

My question is simply: what’s happened? These links haven’t disappeared; they’re still out there, it’s just that Google Analytics no longer thinks they exist.

Has Google suffered a database mishap?

As a result the index page at least has dropped in the SERPs, but it is still on the first page for its search term. Its PageRank has also gone up by 1.

Very puzzling indeed.

Thanks all,

It might be showing this because of a bug… wait a few days and it should get updated.

Thank you for your replies. I actually meant Google Webmaster Tools, but thankfully you guys knew that from the get-go.

I’m going to leave it a few days and see where it takes me. Just yesterday my website moved up 2 SERP spots on the first page, and today it fell one again, still +1 up from where it was three days ago.

I can’t believe it’s the content. Maybe Google wasn’t impressed by some links coming from forums, but it’s also removed links from some other sources that aren’t forums and aren’t my doing, so… time will tell.

Thanks again.

This is probably because of the new Google Panda update that has taken place. The whole algorithm has been slightly tweaked, and quality content is king where indexed backlinks are concerned. You might have been hit with a penalty from Google; you never know, because Google does not notify you of the penalties you get. Wait a few more days and stay away from spamming, because Google is getting smarter by the day.

Thanks for your insight.

I’m not a spammer as such; if I post a link in a forum, it’s because it helps answer the original poster’s question, so the link is relevant.

I just hope it’s the Panda update.

I’ve added image alternative text in some places to describe the images to Google’s bot. I can’t see how this would be perceived as wrongdoing, given that I only use a few small and relevant words (which just happen to be relevant to the theme of the website due to the nature of the images) and there’s no keyword stuffing. I don’t even have a meta keywords tag, given how much I’ve read that it gets ignored by Google nowadays anyway.
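In case it helps anyone check the same thing on their own site, here’s a rough sketch of how I’d look for images with no alt text, using only Python’s built-in html.parser (the filename “index.html” is just a placeholder for one of your own pages):

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Collect the src of every <img> tag that has no (or an empty) alt attribute."""

    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_map = dict(attrs)
            if not attr_map.get("alt"):
                self.missing_alt.append(attr_map.get("src", "<no src>"))

# "index.html" is a placeholder; point it at one of your own pages.
with open("index.html", encoding="utf-8") as f:
    checker = ImgAltChecker()
    checker.feed(f.read())

for src in checker.missing_alt:
    print("Missing alt text:", src)
```

Anything it prints is an image I’d go back and describe in a few plain words, nothing more.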

You never know what you get penalized for in the case of Google. I think you should wait for some time and see whether there is a change in the rankings. If not, I would be happy to help you out!

Well, what is your website, so we can check your backlink profile? Webmaster Tools takes your site links into account too.

Open Site Explorer or Yahoo Site Explorer are better tools for backlinks.

Yahoo’s tool was a good one for checking backlinks, but they have stated it’s going to be closed by the end of the year.

I’ve always liked the SEOMoz Open Site Explorer better anyway.

Just checked Webmaster Tools again today and the links have gone up from 35 to 880. Would this suggest it was a database crash that Google experienced a few days back? Or maybe it just reset its bot to go and pick up the links all over again, but this time using an improved algorithm?

This also happened to one of our sites. When I logged in to Analytics, I saw that the link count had suddenly dropped to 3. I wish I knew the answer, but I don’t!

I can only think of two rational answers:

  1. Google had a DB crash
  2. Google reset Googlebot to go and rediscover backlinks from scratch, as a possible way of validating which are still active

I guess if someone attempts black-hat SEO, that might be an explanation too.

Google has changed how websites and blogs will rank, and has been waging war against all kinds of search engine spam. Did you check “HTML suggestions” in Google WT?

Oh man, this might be due to the Panda update. I don’t know what the hell is happening. Check your backlinks; you might have links from spam sites.

Yes, indeed. Google Webmaster Tools only suggests two HTML title changes, and both concern my “About Us” and “Contact Us” pages. Everything else it likes.

Strangely though (even if it is a separate topic really), it keeps having the habit of detecting what it thinks are invalid on-page links (as opposed to backlinks, hence a separate topic). Usually these are 301s, and it thinks my user-friendly name links are “not found”, even though I test them regularly and they work just fine (i.e. they redirect to the ugly affiliate links). If Googlebot is taking these friendly names and sticking a “www.” on them, then of course they’ll be not found, but why on earth it would do that is beyond me.
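Just to illustrate what I mean by testing them: a rough sketch along these lines (the two URLs are made-up placeholders, not my actual links) requests each friendly link without following the redirect, so you can see the 301 status and the Location it points at, using only Python’s standard library.

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Refuse to follow redirects so the original 301/302 response can be inspected."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)

# Made-up placeholders standing in for the real friendly-name links.
friendly_links = [
    "http://www.example.com/go/some-product",
    "http://www.example.com/go/another-product",
]

for url in friendly_links:
    try:
        resp = opener.open(url)
        print(url, "->", resp.status, "(no redirect)")
    except urllib.error.HTTPError as e:
        # A 301/302 lands here because we refused to follow it.
        print(url, "->", e.code, "Location:", e.headers.get("Location"))
```

If the friendly names really do answer with a 301 and a sensible Location, then the “not found” reports are on Google’s side, not mine.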

It also keeps saying other links aren’t found on a site such as “http://nibbler.silktide.com/reports/”, because it gets confused and reads links from “http://nibbler.silktide.com/reports/www.<my domain name here>.com” instead of just focusing on my sitemap.xml and the links it finds when sniffing my own HTML pages. If it did the latter it wouldn’t be making these errors. It’s somewhat strange that Google coded their bot to rely on a third party to tell it whether on-page links are valid, rather than getting the truth straight from the horse’s mouth (i.e. the website owner, and thus first and foremost the accompanying sitemap.xml).
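For completeness, since I keep saying the sitemap is what it should be reading from: here’s a rough sketch (the sitemap address is a placeholder for your own domain) that pulls every <loc> entry out of sitemap.xml and checks that each one actually responds.

```python
import urllib.error
import urllib.request
import xml.etree.ElementTree as ET

# Placeholder address; substitute your own domain's sitemap.
SITEMAP_URL = "http://www.example.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as resp:
    root = ET.fromstring(resp.read())

# Standard sitemap namespace used for the <url><loc> entries.
ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
urls = [loc.text.strip() for loc in root.findall(".//sm:loc", ns)]

for url in urls:
    try:
        with urllib.request.urlopen(url) as page:
            print(page.status, url)
    except urllib.error.HTTPError as e:
        print(e.code, url)
```

Everything listed in the sitemap comes back fine for me, which is exactly why the third-party-sourced “not found” errors seem so odd.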