How to avoid crawl errors

Hi friends, kindly share your ideas on how to avoid crawl errors in Webmaster Tools. They keep appearing for my site, so please suggest a good solution to this problem.

Are you building your web pages the right way? Have you put up a robots.txt so the spiders know what to crawl and what not to crawl?
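For reference, a minimal robots.txt (a plain text file served at the site root, e.g. https://example.com/robots.txt) might look like this; the blocked paths below are only placeholders:

```
User-agent: *
Disallow: /admin/
Disallow: /search

Sitemap: https://example.com/sitemap.xml
```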

Those are the things that come to my head right now.

Stand up?

Have you looked at Google’s help pages? https://support.google.com/webmasters/answer/35120?hl=en

There’s not really much else we can say without more information. Which crawl errors are you experiencing on your site? Have you made recent changes which might have caused problems?

Reducing the website's loading time is the best solution.
Check your website with Google's PageSpeed Insights.
Get help from your designer and developer, and optimize the website based on the PageSpeed Insights suggestions.

Score above 90% and you should later see the crawl errors fixed.
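If you want to check the score programmatically rather than in the browser, here is a minimal Python sketch against the public PageSpeed Insights v5 API (no API key is needed for light use; the JSON field names are as the v5 response returns them today and may change):

```python
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def pagespeed_score(url, strategy="mobile"):
    """Return the Lighthouse performance score (0-100) for one URL."""
    resp = requests.get(PSI_ENDPOINT, params={"url": url, "strategy": strategy})
    resp.raise_for_status()
    data = resp.json()
    # Lighthouse reports the score as 0-1; scale to the familiar 0-100.
    return data["lighthouseResult"]["categories"]["performance"]["score"] * 100

print(pagespeed_score("https://example.com"))
```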

So you’re saying that less than optimal page speed is directly related to crawl errors?

Please provide a link to the Google documentation that explains this relationship.

Getting a score above 90% is also one of the ways to minimize crawl errors (in my practical experience). I don't have any links to documentation about that.

Crawl errors are normal for large websites like e-commerce portals.
Solution:
Increase your crawl rate
Create a custom 404 page
Use redirects appropriate to the error (see the sketch below)
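As a rough illustration of the last two points, here is a minimal sketch assuming a Python/Flask backend (swap in the equivalent for your own stack): a custom 404 page that keeps the 404 status code, plus a 301 redirect for a moved URL. The URLs and template name are hypothetical:

```python
from flask import Flask, redirect, render_template

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Serve a helpful custom 404 page (templates/404.html), but keep the 404
    # status code -- returning 200 here would create "soft 404" crawl errors.
    return render_template("404.html"), 404

@app.route("/old-category/widget")  # hypothetical retired URL
def old_widget():
    # Permanently redirect a moved page so crawlers update their index.
    return redirect("/products/widget", code=301)
```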

GWT - Crawl stats - Time spent downloading a page
A 400-600 ms average is good for a smaller website
A 700-1400 ms average is good for a larger website
This can be achieved when the website is ultra fast.

@cosmichq

If the site is the one mentioned in your profile, I would first validate the site and remove all the HTML errors, followed by removing the HTML warnings. This may also allow Google Chrome to actually render the site. Firefox and Opera are more lenient and display the page OK. I use Linux and am unable to test IE, but I dread to think what it will be like.
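If you want to script that check, a rough Python sketch against the W3C Nu HTML Checker's JSON output (https://validator.w3.org/nu/) could look like this; the response field names are as the checker returns them today and are not guaranteed:

```python
import requests

def html_errors(page_url):
    """Return the validation errors the Nu HTML Checker reports for a page."""
    html = requests.get(page_url, timeout=30).text
    resp = requests.post(
        "https://validator.w3.org/nu/?out=json",
        data=html.encode("utf-8"),
        headers={
            "Content-Type": "text/html; charset=utf-8",
            "User-Agent": "crawl-error-check",  # some W3C services reject blank agents
        },
    )
    resp.raise_for_status()
    return [m for m in resp.json().get("messages", []) if m.get("type") == "error"]

for err in html_errors("https://example.com"):
    print(err.get("lastLine"), err.get("message"))
```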

My guess is that your experience with reduced crawl errors has less to do with page speed and more to do with your serving a page for bad HTTP requests, i.e. you are returning a "found" so the search engines are none the wiser. Might this be the case?
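A quick way to test for that behaviour: request a URL that should not exist and confirm the server answers 404 rather than 200. A minimal Python sketch (the URL is obviously a placeholder):

```python
import requests

resp = requests.get(
    "https://example.com/this-page-should-not-exist",  # hypothetical bad URL
    allow_redirects=False,
)
# A healthy site returns 404 here; 200 means you are serving "soft 404s".
print(resp.status_code)
```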

Yes, you got my point.
Our ultimate aim is to reduce the crawl errors… :smile: in any way we can.

Surely the aim is to discover what’s causing the crawl errors and fix it. After all, if Googlebot is having difficulty navigating your site, your visitors may well be experiencing the same. And as I said earlier, without knowing the kind of crawl errors @cosmichq is experiencing, it’s really not possible to offer sound advice on solving them.

Speed is only an issue if the page load time is too high. It only needs to be reasonable to avoid a crawl error (a page load speed error that is).

Crawl errors are more often caused by dead links, site errors, malformed HTML and unreliable servers (the server being down when the search engine tries to crawl the page).
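If you want to hunt for dead links yourself rather than wait for the crawl report, a rough Python sketch (assuming the third-party requests and beautifulsoup4 packages) could look like this:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def dead_links(page_url):
    """Return (url, status) pairs for links on one page that look broken."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    broken = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, fragment-only links, etc.
        try:
            # HEAD keeps this cheap; some servers only answer GET, so expect
            # the occasional false positive.
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None  # unreachable host counts as broken too
        if status is None or status >= 400:
            broken.append((link, status))
    return broken

for link, status in dead_links("https://example.com"):
    print(status, link)
```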
