Should I disallow Googlebot from crawling slower pages?

As we know, website speed is a ranking factor. Some of my pages handle complex user requests and take a long time to respond, giving slow page load times. Should we disallow Googlebot from crawling and indexing these pages to improve my overall website speed?

Why remove the pages?

Find and fix the slow pages.

Google will be happier :slight_smile:


Firstly, each page is ranked on its own merits against the exact search query used. While the overall quality of your site may be a small ranking factor when it comes to separating two results of similar quality, having a few slow pages on your site isn’t going to have an adverse impact on your general search rankings.

The main question is, do you want your potential visitors to be able to find these slow pages directly from Google search? If the answer to that is yes, then let Googlebot crawl and index them. If the answer is no, for some reason, then exclude Googlebot.
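If you do decide to exclude Googlebot from specific slow pages, a robots.txt rule is the usual mechanism. Here's a minimal sketch, assuming the slow pages live under a path like `/slow-reports/` (that path is made up for illustration; substitute your actual URLs):

```
# Hypothetical example: block Googlebot from crawling a slow section.
# The /slow-reports/ path is an assumption, not from the original question.
User-agent: Googlebot
Disallow: /slow-reports/
```

One caveat: `Disallow` prevents crawling, not necessarily indexing; a blocked page can still appear in search results if other sites link to it. To keep a page out of the index entirely, leave it crawlable and add a `noindex` robots meta tag (`<meta name="robots" content="noindex">`) instead.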

Here’s the same question answered by Matt Cutts; check it here, it might answer your concern.


I agree with you! A website can gain new rankings and traffic on these pages after fixing their speed :slight_smile:

This topic was automatically closed 91 days after the last reply. New replies are no longer allowed.