I hope I am in the right area.
I am in charge of a website, and I am concerned about its robots.txt.
Ahrefs reports "503 Service Unavailable".
But the file exists and can be accessed in a browser, and its permissions are 644.
Does anyone know what the problem could be?
If you are sure the file exists and is accessible, I don't think you have a problem.
The server may simply have been having a bad day when Ahrefs visited.
You mention robots.txt in the title, but not in the question. Is there anything in that file that might be restricting robots from the page?
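For reference, a rule like the following in robots.txt would block Ahrefs' crawler outright (a hypothetical example, not taken from your site):

```
# Hypothetical rule: this would tell Ahrefs' crawler to stay off the whole site
User-agent: AhrefsBot
Disallow: /
```

If nothing like this (or a blanket `User-agent: *` / `Disallow: /`) is present, the 503 is coming from the server itself, not from robots.txt.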
Thanks for your reply.
You can find the robots.txt here: https://gutachter-noyal.de/robots.txt
I don't know if it is a problem, but Ahrefs hasn't been able to access the site for weeks, so it can't run any audit.
There is nothing in robots.txt that should cause a problem. A
site: search shows three pages indexed on Google, which is what your sitemap shows, so no issue there.
It looks more like an Ahrefs problem.
Okay, thank you. Then it is what it is.
HTTP 503 may mean a rate limit was hit. Do you know if you have a rate limit anywhere, either in your code, the web server, or anything in front of it (like a load balancer)?
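One quick way to test for a bot-specific limit is to request the file with a normal browser User-Agent and then with AhrefsBot's User-Agent string, and compare the status codes. A minimal sketch using only the Python standard library (the AhrefsBot UA string below is the one Ahrefs publishes; the comparison logic is just an illustration, not Ahrefs' own check):

```python
import urllib.request
import urllib.error

def fetch_status(url: str, user_agent: str) -> int:
    """Return the HTTP status code the server sends for this User-Agent."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses also carry a status code
        return e.code

if __name__ == "__main__":
    url = "https://gutachter-noyal.de/robots.txt"
    browser_ua = "Mozilla/5.0"
    bot_ua = "Mozilla/5.0 (compatible; AhrefsBot/7.0; +http://ahrefs.com/robot/)"
    print("browser UA:", fetch_status(url, browser_ua))
    print("bot UA:    ", fetch_status(url, bot_ua))
```

If the browser UA gets 200 while the bot UA gets 503, something in your stack is filtering or throttling bots specifically.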
It wouldn’t surprise me if Ahrefs hits such a limit, as their bot is pretty poorly coded.
No, nothing like that. Just now it worked again when I started the crawl manually.