I have just been into Webmaster Tools to find a message from Google which reads:
http://www.web-writer-articles.co.uk/: Googlebot can't access your site (Jul 17, 2012)
Over the last 24 hours, Googlebot encountered 2 errors while attempting to access your robots.txt. To ensure that we didn't crawl any pages listed in that file, we postponed our crawl. Your site's overall robots.txt error rate is 40.0%.
It goes on to recommend: Using Webmaster Tools, find a day with a high error rate and examine the logs for your web server for that day. Look for errors accessing robots.txt in the logs for that day and fix the causes of those errors.
The most likely explanation is that your site is overloaded. Contact your hosting provider and discuss reconfiguring your web server or adding more resources to your website.
The history behind this is that I recently had to move my website from one server to another because the host's server was compromised by a virus. Could the errors have occurred at the time of that move?
Anyway, as recommended in the message, I contacted my host, and they said everything was fine with the server and there were no issues now. I couldn't really argue; I didn't know what to ask for.
So now I'm a bit stuck and worried that Google can't access the site. How do I know when it is accessing the site? And where do I find my error logs?
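If it helps, this is roughly how I was planning to search for Googlebot in the logs once I find them. The log path and format here are guesses on my part (I've made up a sample file in the common Apache combined format just to show what I mean):

```shell
# Hypothetical sample in Apache combined log format -- my real log
# location will depend on the host (often /var/log/apache2/access.log
# or a "logs" folder in the hosting control panel).
printf '%s\n' \
  '66.249.66.1 - - [17/Jul/2012:10:00:00 +0000] "GET /robots.txt HTTP/1.1" 503 0 "-" "Googlebot/2.1"' \
  '203.0.113.5 - - [17/Jul/2012:10:01:00 +0000] "GET / HTTP/1.1" 200 1234 "-" "Mozilla/5.0"' \
  > sample_access.log

# Count how many requests came from Googlebot
grep -c "Googlebot" sample_access.log

# Show robots.txt requests that returned a 5xx error (server problem)
grep "robots.txt" sample_access.log | grep ' 50[0-9] '
```

Is that the right sort of thing to look for, or should I be asking the host for something else?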
Is there a problem with the robots.txt file? It is as follows:
Thank you for your time with this.