A site audit is a great way to check whether your website is fully optimized for your users. However, while an audit is running, it can cause speed issues for your real-time site visitors. Read on to learn about crawl rate, crawl delay, and why and how you should slow down your crawl during a site audit.
Crawl rate is how many requests per second a bot sends to your site while crawling it, for example, 10 requests per second; equivalently, it determines how much time elapses between two consecutive requests. Hence, the higher the crawl rate, the faster a bot crawls your website.
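To put that in concrete numbers: at 10 requests per second, a bot sends a new request every 0.1 seconds, which works out to 36,000 requests per hour.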
Crawl delay, on the other hand, signals the bot to wait a certain amount of time before sending another request. This directive is given to crawlers so that they don’t overload the server; it is essentially a signal to slow down the crawl.
The interpretation of the crawl-delay directive varies across search engines. However, the basic idea remains the same.
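For reference, the directive is usually declared in a site’s robots.txt file. A minimal, illustrative example (the 10-second value and the wildcard user-agent are arbitrary placeholders) looks like this:

```
# Ask compliant crawlers to wait 10 seconds between requests
User-agent: *
Crawl-delay: 10
```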
BacklinkSEO has three options available for slowing down your crawl during a site audit: default delay, respect robots.txt crawl-delay, and 1 URL per 2 seconds.
The default delay option is exactly what the name suggests: the bots will crawl your site at a default rate, meaning they wait 1 second before crawling the next page.
Now, you might have a robots.txt file on your site. The file implements what is known as the Robots Exclusion Protocol, which, in theory, defines a road map of sorts that each compliant bot follows. You can also specify the crawl delay in this file.
If you want the BacklinkSEO bots to follow the protocol, just select the “respect robots.txt crawl-delay” option. The bots will do the rest.
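In practice, respecting a crawl delay comes down to reading the value from robots.txt and pausing between requests. Here is a minimal Python sketch of that idea using the standard library’s robots.txt parser; the site URL, user-agent, and page list are placeholders, and this is not BacklinkSEO’s actual crawler:

```python
import time
import urllib.robotparser
import urllib.request

SITE = "https://example.com"    # placeholder: the site being audited
USER_AGENT = "ExampleAuditBot"  # placeholder user-agent string

# Read robots.txt and look up the crawl-delay for this user-agent.
robots = urllib.robotparser.RobotFileParser()
robots.set_url(SITE + "/robots.txt")
robots.read()
delay = robots.crawl_delay(USER_AGENT) or 1  # fall back to a 1-second delay

for path in ["/", "/about", "/contact"]:     # placeholder pages to audit
    if robots.can_fetch(USER_AGENT, SITE + path):
        request = urllib.request.Request(
            SITE + path, headers={"User-Agent": USER_AGENT}
        )
        urllib.request.urlopen(request)
    time.sleep(delay)  # wait before sending the next request
```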
Now, if you don’t have such a directive in your robots.txt file, you can tell BacklinkSEO to crawl 1 page per 2 seconds. Doing so will significantly increase the length of the SEO audit. However, here is why you should slow down your crawl during an audit anyway.
Suppose you have a lot of pages on your site, many of which are linked from the index page. A bot crawling your site may therefore generate an excessive number of requests in a short period of time.
Hosting resources are usually monitored on an hourly basis, and such a spike in traffic can deplete them. It is therefore a good idea to slow down your crawl by setting the crawl delay to 1-2 seconds, so that the bots crawl your website at a moderate pace without causing frequent traffic peaks.
A crawl-delay directive causes a site audit to take longer than usual. However, it also means the real-time visitors on your website face speed issues less frequently.
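To put that in perspective (the page count here is just an illustration): a 1,000-page site crawled at 1 URL per 2 seconds takes about 2,000 seconds, roughly 33 minutes, compared with roughly 17 minutes at the 1-second default delay.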
Google does not support the crawl-delay directive, so Google’s bots will simply ignore it. However, if you want Googlebot to crawl your website less aggressively, you can lower the crawl rate from Google Search Console.
Here’s how you can lower your crawl rate in Google Search Console.
Similarly, Baidu doesn’t support the crawl-delay directive either. As with Google, you have to log into Baidu Webmaster Tools to adjust your crawl rate. The process is very similar to Google’s.
Yandex supports crawl delay: if you set the crawl delay to x seconds, its bots will wait x seconds before requesting another URL. You can also manage this in Yandex.Webmaster, which lets you adjust the crawl rate for any website.
Bing and Yahoo interpret crawl delay a bit differently from Yandex. If you set the crawl delay to x seconds, these search engines divide 24 hours into x-second windows and allow their bots to crawl a maximum of one page within each window.
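For example, with a crawl delay of 10 seconds, a Yandex bot simply pauses 10 seconds between consecutive requests, whereas Bing and Yahoo would crawl at most one page per 10-second window, i.e. at most 24 × 60 × 60 / 10 = 8,640 pages in a day.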