Robots.txt / Crawling Error

Hello friends!

I have a problem and need help. I have three websites on the same IP address. One of them works fine in the search engines: Google crawls it regularly, and I get no warning messages about it in Webmaster Tools. But the other two sites receive this warning message:

“When we tested a sample of the URLs from your Sitemap, we found that some URLs were not accessible to Googlebot due to network timeouts. If this problem persists, please check the network availability of your DNS and web servers. All accessible URLs will still be submitted.”

Can anyone help me resolve this problem and tell me what I should do to fix it?

Hello,

I think you are facing one of the following problems with your sitemap:

  • Your Sitemap does not appear to be in a supported format. Please ensure it
    meets our Sitemap guidelines and resubmit.

OR

  • Some URLs listed in this Sitemap have a high response time. This may
    indicate a problem with your server or with the content of the page.

To solve this issue, kindly contact the DNS provider for your websites.
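If you want to rule out the first possibility yourself before contacting anyone, you can check that the file downloads and parses as a standard sitemaps.org <urlset>. Here is a minimal sketch in Python (the sitemap URL is only a placeholder, replace it with your own):

    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholder sitemap URL -- replace with your own.
    SITEMAP_URL = "https://www.example.com/sitemap.xml"
    SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

    # Fetch the sitemap and try to parse it as XML.
    with urllib.request.urlopen(SITEMAP_URL, timeout=30) as response:
        root = ET.fromstring(response.read())

    # A standard sitemap has a <urlset> root in the sitemaps.org namespace
    # and one <loc> element per listed URL.
    if root.tag != SITEMAP_NS + "urlset":
        print("Unexpected root element:", root.tag)
    else:
        locs = root.findall(SITEMAP_NS + "url/" + SITEMAP_NS + "loc")
        print("Sitemap parsed OK,", len(locs), "URLs listed")

If this parses without errors, the sitemap format is not the problem and you can focus on the response-time issue instead.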

I hope this solution helps you.
Thank you

http://www.graphiera.com/sitemap.xml is my website's sitemap. Can you check it and tell me what the problem with it is?

According to the message you quoted in your first post, there is nothing wrong with the sitemap itself; the problem is that Google was unable to access some of the URLs because they were timing out. If the URLs listed in your sitemap are correct, then the timeouts on your server are what you need to address, not the sitemap.
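If you want to see which URLs are slow, one rough way is to request every URL listed in the sitemap and record how long each response takes; anything that fails or takes several seconds is a likely candidate for the timeouts Google reported. A minimal sketch, assuming the sitemap is reachable and uses the standard sitemaps.org format (the URL and the thresholds are illustrative, not Googlebot's actual limits):

    import time
    import urllib.request
    import xml.etree.ElementTree as ET

    # Placeholder sitemap URL -- replace with your own.
    SITEMAP_URL = "https://www.example.com/sitemap.xml"
    SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    TIMEOUT_SECONDS = 10  # illustrative cut-off, not Googlebot's published limit

    # Pull the list of URLs out of the sitemap.
    with urllib.request.urlopen(SITEMAP_URL, timeout=30) as response:
        root = ET.fromstring(response.read())
    urls = [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc") if loc.text]

    # Request each URL and report anything slow or unreachable.
    for url in urls:
        start = time.monotonic()
        try:
            with urllib.request.urlopen(url, timeout=TIMEOUT_SECONDS) as page:
                page.read()
            elapsed = time.monotonic() - start
            flag = "SLOW" if elapsed > 5 else "ok"
            print(flag, round(elapsed, 2), "s", url)
        except Exception as exc:
            print("FAIL", url, "(", exc, ")")

If possible, run it from a machine outside your own network, so slow DNS resolution or firewall rules on the server side show up the same way they would for Googlebot.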

The problem may still persist because of a server issue. Kindly check with your hosting provider as well.

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.