Microsoft To Reduce Cloaking Detection Checks from MSNBot

In a post on the Live Search Webmaster Central Blog, Microsoft announced that it has released a patch for its cloaking detector that “should significantly reduce the number of requests to a more acceptable rate.” The cloaking detector, which is part of Live Search’s MSNBot crawler, checks whether websites are serving different content to visitors than they provide to MSNBot, in order to “weed out spammers”. The detector mimics a visitor coming to a webpage from the Live search engine. A problem with the detector has caused it to unnecessarily perform hundreds of checks a day on some websites, which uses server resources and fills website statistics with fake referrers. Due to similar problems, Microsoft announced in December of 2007 that it would make changes to the detector.

Microsoft also said the cloaking detector problem was “compounded by and also confused with” a new feed crawling function that was “overzealous in its attempt to crawl and provide up-to-the-minute results”, and that a patch was released for the feed crawler as well. The new feed crawler is intended to “help provide fresh results” in Live Search. In the post, Microsoft asked webmasters to help it discover content changes, saying:

You can do this via sitemaps and various meta properties per link or via RSS link to notify us about very important content. To prevent us from having to monitor lots of feeds often, we recommend aggregating content change onto a few feeds; adding the name, “Aggregate” somewhere in the feed name. We also suggest referring to them in robots.txt and your sitemap—both of which will help us detect them and their use.
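
The post does not include a concrete example; the following sketch, using a hypothetical example.com site and feed name, shows one way a robots.txt could point crawlers at a sitemap that in turn lists an aggregate change feed:

    # robots.txt (hypothetical): allow crawling and advertise the sitemap
    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml

with a corresponding sitemap entry for the aggregate feed:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- hypothetical feed aggregating recent content changes, named per the "Aggregate" suggestion -->
        <loc>http://www.example.com/feeds/aggregate.rss</loc>
        <changefreq>hourly</changefreq>
      </url>
    </urlset>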
