Can I control Google crawler access?

Please help me with information about crawler access. I want to know what crawl rate is best for better indexing. Can I control the frequency of crawler access? I have developed some sites and followed every step that an SEO-friendly site should follow. I keep an eye on how search engines respond when I change the title tag, meta description and other elements. Bing and Yahoo pick up the changes quickly, but Google doesn't. So my concern is about Google: is there any way to get changes to take effect quickly in Google?

Thanks

Hi cmpranto and welcome to the forums. :slight_smile:

Within Google Webmaster Tools, under Site Configuration > Settings there is an option to set a custom crawl rate. The notes say:

Our goal is to crawl as many pages from your site as we can without overwhelming your server’s bandwidth. You can change the crawl rate (the speed of Google’s requests during the crawl) for sites at the root or subdomain level - for example, www.example.co.uk and http://subdomain.example.co.uk. The new customised crawl rate will be valid for 90 days.

Robots.txt files and the robots meta tag are helpful to some extent for controlling Google crawler access. Some examples:

To stop robots from indexing under-construction pages:

<meta name="robots" content="noindex,nofollow">

To invite robots to index your pages and follow their links:

<meta name="robots" content="index,follow">
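
For robots.txt, a minimal sketch (assuming a hypothetical /under-construction/ directory you want to keep crawlers out of) would be:

# Keep all crawlers out of the under-construction section (the path is just an example)
User-agent: *
Disallow: /under-construction/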

Hi Vilishost, and welcome.

TechnoBear has told you how to change the crawl rate. But doing that won’t necessarily mean that Google’s search results will reflect changes to your site any faster. Regardless of how often Google crawls the site, there can still be a delay before it’s reflected in the results. However, a faster crawl will generally mean that entirely new pages show up sooner.

The main reason to change the crawl rate is if you’re worried that the crawler is using too much bandwidth.

Also, robots.txt isn’t really relevant here. You can use robots.txt to prevent pages being crawled, but as far as I know you can’t use it to change the crawl rate.

Mike

You can allow the Google crawler to crawl your site via the robots.txt file. You can see Googlebot activity in Google Webmaster Tools under Diagnostics >> Crawl Stats and Diagnostics >> Crawl Errors.
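
As a rough sketch, a robots.txt that lets Googlebot crawl everything could be as simple as:

# An empty Disallow means nothing is blocked for Googlebot
User-agent: Googlebot
Disallow: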

Of course you can set the crawl rate for a website within Google Webmaster Tools; the option is under Site Configuration > Settings, where you can set a custom crawl rate. But you shouldn't change it without good reason, because the crawler won't visit your site more often than the rate you set.

Use an XML sitemap to control your crawl rate.

You can block the Google crawler from accessing your entire website or specific web pages, but you can't control the frequency of access through robots.txt.
However, the question may then arise: why does the sitemap have an option for crawl frequency? My guess is that this option is only honoured by a few old web crawlers. Modern bots crawl websites based on how often the content is updated. Googlebot also won't strictly follow the crawl-frequency controls you set in Google Webmaster Tools. In my view, the only reliable way to signal how often Googlebot should crawl your pages is how often you update your site's content.
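
For reference, a minimal sitemap entry (with a made-up URL and dates) showing the <changefreq> hint being discussed might look like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/news.html</loc>
    <!-- lastmod and changefreq are only hints; Googlebot decides its own schedule -->
    <lastmod>2012-01-15</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>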

You can check the crawler information in Google Webmaster Tools.