Website Down When Bot Visits?

Hi All,
My website was ranking on the 3rd page for my targeted keyword, but since last week it has been nowhere in the top 30 pages (300 results). Last week my website was down for half a day due to a problem with my hosting provider. Could this be a factor affecting my results? If so, how can I overcome this issue? Kindly help me sort out this problem…

Google is realistic and knows that sites sometimes go offline temporarily – it is very unlikely to penalise you or drop your ranking for a brief, isolated incident. I once had my website go offline for a couple of days and noticed no major drop in search ranking or traffic once it came back online, so I wouldn’t expect half a day of downtime to cause any lasting damage. But if your site is persistently unavailable, Google will start to treat it with suspicion – it wants to send people to the best site out there, and the best site out there isn’t going to be one that returns a server error one time in three.

Thanks, Stevie, for your quick thoughts!

Just to add to Stevie’s good advice:

If your site is registered with Google Webmaster Tools (GWT), you can always check what problems the bot had in crawling your site - and when. If you look at the timeline under Crawl Errors, you should see a spike on the day the site was unavailable. All being well, the graph should return to normal immediately afterwards. If it doesn’t, then something is definitely wrong.


If your site was ranking in the thirties for a target keyword, it wouldn’t take much for an algorithm change to send it tumbling. Put in simplified terms, SERP rankings are logarithmic with respect to competition: roughly speaking, the single top site scores “100%”, the top 10 score 99%, and the top 100 score 98%. Under that model, losing 1% will not move you from 37th to 38th, but to roughly 370th.
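The arithmetic behind that last point can be sketched as a toy model. Nothing here reflects any published Google formula; the `score`/`rank` relationship below is purely illustrative of the “each 1% costs you a factor of 10 in position” idea:

```python
import math

def rank_for_score(score: float) -> float:
    """Toy log-scale model: score 100 -> rank 1, 99 -> rank 10,
    98 -> rank 100. Each 1-point drop multiplies rank by 10."""
    return 10 ** (100 - score)

def score_for_rank(rank: float) -> float:
    """Inverse of the toy model: the score implied by a rank."""
    return 100 - math.log10(rank)

# A site at position 37 has an implied score just under 98.5.
s = score_for_rank(37)

# Losing one "point" of score drops it tenfold in rank:
# from 37th to about 370th, not 38th.
new_rank = rank_for_score(s - 1)
```

The point of the sketch is only that, on a logarithmic scale, a small absolute loss near the top of the distribution translates into a large jump in position - which is why a minor algorithm tweak can knock a page-3 result out of the visible results entirely.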