Is it possible to control the crawl speed by robots.txt?

Googlebot · Oct 16, 2011 · Viewed 12.9k times

We can tell bots to crawl or not to crawl our website in robots.txt. On the other hand, we can control the crawl rate in Google Webmaster Tools (how often Googlebot crawls the website). I wonder if it is possible to limit crawler activity through robots.txt itself.

I mean allowing bots to crawl pages, but limiting their activity by time, number of pages, or size.

Answer

ZurabWeb · Feb 3, 2012

There is one directive you can use in robots.txt for this: "Crawl-delay". It goes inside a User-agent group, for example:

User-agent: *
Crawl-delay: 5

This means robots should crawl no more than one page every 5 seconds. As far as I know, though, this directive is not officially part of the robots.txt standard, and support varies from crawler to crawler.
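If you are writing your own crawler, Python's standard library can read this value for you and you can throttle requests accordingly. A minimal sketch using urllib.robotparser (the site URL, user-agent string, and paths below are placeholders, not anything from the question):

import time
import urllib.request
from urllib import robotparser

SITE = "https://example.com"   # placeholder host
USER_AGENT = "MyCrawler"       # placeholder user-agent

rp = robotparser.RobotFileParser()
rp.set_url(SITE + "/robots.txt")
rp.read()  # fetch and parse robots.txt

# crawl_delay() returns the Crawl-delay value for this agent, or None if unset.
delay = rp.crawl_delay(USER_AGENT) or 1  # fall back to 1 second

for path in ["/", "/about", "/contact"]:
    url = SITE + path
    if rp.can_fetch(USER_AGENT, url):  # respect Disallow rules
        req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
        with urllib.request.urlopen(req) as resp:
            print(url, resp.status)
        time.sleep(delay)              # respect Crawl-delay

Of course, this only helps for crawlers you control; whether a third-party bot honours Crawl-delay is entirely up to that bot.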

Also, some robots don't take the robots.txt file into account at all. So even if you have disallowed access to some pages, they may still get crawled by some robots; of course, not by the largest ones like Google.

Baidu, for example, may ignore robots.txt, but I'm not certain of that.

I don't have an official source for this, so you can just Google it.