I’ve set up a 503 with a Retry-After: 86400 header on xxx.com, and created a dev.xxx.com with a robots.txt of Disallow: /.

Will I have a problem with this? Will this affect future SEO? I don’t know much about SEO and my past attempts at it have not gone well; a couple ended up in irreversible situations.
### How can I temporarily suspend all crawling of my website?
You can temporarily suspend all crawling by returning an HTTP result code of 503 for all URLs, including the robots.txt file. The robots.txt file will be retried periodically until it can be accessed again. We do not recommend changing your robots.txt file to disallow crawling.
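To make that guidance concrete, here is a minimal sketch of what “503 for all URLs, including robots.txt, with Retry-After: 86400” could look like. It assumes a Python/WSGI setup purely for illustration (the question doesn’t say what server xxx.com runs); in practice this would usually be done in the web server or CDN configuration rather than application code.

```python
# Minimal sketch (assumed Python/WSGI setup, not the poster's actual server):
# answer every request, including /robots.txt, with 503 + Retry-After: 86400.
from wsgiref.simple_server import make_server

RETRY_AFTER_SECONDS = "86400"  # 24 hours, as in the question

def app(environ, start_response):
    # Every path gets the same temporary-outage response, so crawlers
    # also see 503 for /robots.txt and retry it later.
    start_response(
        "503 Service Unavailable",
        [
            ("Retry-After", RETRY_AFTER_SECONDS),
            ("Content-Type", "text/plain; charset=utf-8"),
        ],
    )
    return [b"Site temporarily unavailable. Please retry later.\n"]

if __name__ == "__main__":
    # Local test server only.
    with make_server("", 8000, app) as server:
        server.serve_forever()
```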
Edit: I’m really not sure where to put this topic.