A few months back, my website's robots.txt file was open to all search engines, and my server's folder permissions were also open to everyone (we failed to catch it; our mistake). A few days later we found crawl errors in Google Search Console: our website was being crawled on unknown URLs that are not in our database. We have since secured our server permissions and robots.txt, but those unknown URLs are still showing up as crawl errors in Search Console. Can someone help us?
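For anyone in a similar situation, a minimal robots.txt that blocks crawling of everything except your public content might look like this (the directory names are hypothetical examples; adjust them to your own site structure):

```text
# robots.txt — block crawlers from private areas, allow the rest
User-agent: *
Disallow: /admin/
Disallow: /uploads/tmp/
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```

Note that robots.txt only controls crawling, not indexing; URLs Google has already discovered can remain in the crawl-error report for a while even after they are blocked.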
Do you mean that your site was hacked, and now that you’ve cleaned it all up, Google is showing crawl errors for “bad” pages which no longer exist?
I had that problem after one of my sites was hacked. I assumed the errors would disappear quite quickly, especially as Google knew the site had been hacked and then repaired, but they didn't. After several months, I eventually used the "Remove URLs" tool in Search Console to get rid of them.
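Alongside the removal tool, it can help to make the server answer those bad URLs with an explicit "gone" status, since Google tends to drop 410 pages from its reports faster than plain 404s. On Apache this can be done with mod_alias in .htaccess; this is a sketch, and the spam paths shown are hypothetical placeholders for whatever URLs appear in your crawl-error report:

```apacheconf
# .htaccess — return 410 Gone for the injected spam URLs
# (replace these example paths with the ones from your crawl errors)
Redirect gone /cheap-pills-page
Redirect gone /spam-directory/
```

With a 410 in place, Google should stop retrying those URLs sooner, and the crawl errors should age out of the report on their own.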
This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.