Googlebot crawl rate tool in Search Console is going away

Google is deprecating the crawl rate limiter legacy tool within Google Search Console on January 8, 2024. Google said the feature is no longer useful because it has improved its “crawling logic and other tools available to publishers.”

What is the crawl rate limiter. It is a tool within the legacy version of Google Search Console that lets you tell Google to crawl your site less than it currently does. Google has historically recommended against limiting the crawl rate unless you are seeing server load problems that are definitely caused by Googlebot hitting your server too hard.

You can access the tool in the legacy version of Search Console until it is removed.

Why Google is removing it. Gary Illyes from Google said, “with the improvements we’ve made to our crawling logic and other tools available to publishers, its usefulness has dissipated.”

“Googlebot reacts to how the site – or more specifically the server handling the site – responds to Googlebot’s HTTP requests. For example, if the server persistently returns HTTP 500 status codes for a range of URLs, Googlebot will automatically, and almost immediately, slow down crawling. Similarly, Googlebot slows down automatically if the response time for requests gets significantly longer. If you do experience unusually heavy crawling that your site can’t manage on its own, refer to this help article,” Gary Illyes added.
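The behavior Illyes describes can be used deliberately: a server under load can return an error status (such as 503 with a Retry-After header) so that crawlers back off on their own. A minimal sketch of that idea is below; the function names (`is_overloaded`, `handle_request`) and the concurrency threshold are illustrative assumptions, not part of any real server framework.

```python
# Hypothetical sketch: return 503 (with Retry-After) when the server is
# overloaded, one of the signals that prompts Googlebot to slow crawling.
# is_overloaded and handle_request are illustrative names, not a real API.

def is_overloaded(active_requests: int, max_requests: int = 100) -> bool:
    """Assumed load check: too many concurrent requests in flight."""
    return active_requests >= max_requests

def handle_request(active_requests: int) -> tuple[int, dict]:
    """Return (status code, extra headers) for an incoming request.

    A 503 with Retry-After asks well-behaved crawlers to back off
    and try again later, instead of hammering an overloaded server.
    """
    if is_overloaded(active_requests):
        return 503, {"Retry-After": "120"}  # come back in ~2 minutes
    return 200, {}

status, headers = handle_request(active_requests=150)
print(status, headers.get("Retry-After"))  # 503 120
```

This is a sketch of the general pattern, not Google-specific behavior: any status in the 5xx range (or a slow response) can trigger the automatic slowdown Illyes describes.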

Illyes explained that the rate limiter tool within Google Search Console “had a much slower effect.” He said the tool often would “have taken over a day for the new limits to be applied on crawling.”

He added that the tool was “rarely” used by site owners and those who used it “in many cases set the crawling speed to the bare minimum.”

Crawl rate change. Google said with the deprecation of the crawl limiter tool, Google is setting the minimum crawling speed to a lower rate, comparable to the old crawl rate limits. So Google will “effectively continue honoring the settings that some site owners have set in the past if the Search interest is low, and our crawlers don’t waste the site’s bandwidth,” Illyes added.

What if you have issues. If you have issues with crawling, Google said you can read this help document and use the report form to let Google know.

Why we care. If you have been using this crawl rate tool, keep in mind it will be going away. So set a notification on your calendar to check what impact, if any, the removal has on your server.

The post Googlebot crawl rate tool in Search Console is going away appeared first on Search Engine Land.
