Google Provides Greater Control of Googlebot Crawl Rate

Google has changed the crawl rate setting in Google Webmaster Central to allow more control over the rate at which Googlebot crawls a website. Previously, the crawl rate could only be set to normal, slower, and in some cases faster. The new setting allows specifying the crawl rate as a number of requests per second or a number of seconds between requests. The range of available values “is based on our understanding of your server’s capabilities” and “may vary from one site to another and across time based on several factors,” according to a post on the official Google Webmaster Central blog. The post also says that setting the rate higher than the default value “won’t improve your coverage or ranking.” Yahoo and MSN/Live Search already allow setting the number of seconds between requests made by their crawlers via the Crawl-delay directive in the robots.txt file.
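
For sites that want to throttle those crawlers, a robots.txt entry like the following would do it (Slurp and msnbot are Yahoo’s and Live Search’s crawlers; the ten-second delay is just an illustrative value):

    User-agent: Slurp
    Crawl-delay: 10

    User-agent: msnbot
    Crawl-delay: 10

Googlebot does not honor Crawl-delay, which is why its rate has to be adjusted through the Webmaster Central interface instead.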
