404 errors for robots.txt, but the file is there

I have a number of sites, all with robots.txt files in place. Last month, some of these sites started showing 404 errors for robots.txt, but only in small numbers, so I wasn’t too worried. This month, the number of 404 errors has increased considerably. In most cases, the 404 figure is 25-50% of the successful download figure; on one site, the errors are 1.5 times as numerous as the successes. Other sites continue to show no 404 errors for robots.txt.

I haven’t changed any of the robots.txt files over this period. Google isn’t reporting any errors and I can access the files at domain/robots.txt every time I try.

Please can somebody explain what’s going on?

(Apologies if this is the wrong forum for the question.)

I am well aware of that and my question is not about 404s for non-existent pages, which I understand perfectly well.

I am asking why some sites are suddenly producing hundreds of 404 errors for a page which does exist, which has not been modified in months and which I and most other users can access without problem.

Are the errors on /robots.txt or /someotherfolder/robots.txt? Can you see in the logs the user-agent that caused these errors?

All in all, if the only unexplained 404s on your site are on robots.txt, you’ve almost certainly got bigger things to worry about (if not, I want your life!). So many sites don’t have that file at all that no search engine is going to slap down your site for not having it.

The errors are on /robots.txt. I hadn’t thought to look for the user-agent, but I’ve checked the logs and they are almost all from discoveryengine.com (discobot). I’ve checked the logs for the sites where I’m not getting 404s for robots.txt, and the same agent is visiting those, from the same IP, without any problem.

There are the odd one or two 404s for other existing pages, but nothing of any significance.

I’m not worried about this from an SEO point of view (hence I wasn’t sure this was the right place to post); it’s just my usual paranoid panic when I find something happening on my sites that I don’t understand and can’t explain. :confused:
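For anyone who wants to repeat the log check described above, here is a minimal sketch in Python. It assumes a standard combined-format access log; the sample entries, file handling, and user-agent strings are made up for illustration:

```python
import re
from collections import Counter

# Combined Log Format:
# host ident user [time] "request" status bytes "referer" "user-agent"
LOG_LINE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[[^\]]+\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def robots_404_agents(lines):
    """Count user-agents that received a 404 for /robots.txt."""
    counts = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and m.group("path") == "/robots.txt" and m.group("status") == "404":
            counts[m.group("agent")] += 1
    return counts

# Hypothetical sample entries, just to show the expected input shape:
sample = [
    '1.2.3.4 - - [01/May/2008:10:00:00 +0000] "GET /robots.txt HTTP/1.1" 404 209 "-" "DiscoBot/1.0"',
    '5.6.7.8 - - [01/May/2008:10:00:05 +0000] "GET /robots.txt HTTP/1.1" 200 64 "-" "Googlebot/2.1"',
]
print(robots_404_agents(sample))
```

Running it over a real access log (one line per request) would show whether the 404s cluster on a single crawler, as they did here.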

404s might be generated by an old cache. For example, G. is reporting 404s for the dead links/pages
we delete from our directory by the dozen every week, but it doesn’t mean we have problems.

You might be experiencing the same condition, and if you know there is nothing wrong with your site’s files,
then stop worrying, get yourself a cup of coffee and relax.

:slight_smile:

fastreplies