How can I make scrapy crawl break and exit when encountering the first exception?

Udi · Mar 1, 2012

For development purposes, I would like to stop all scrapy crawling activity as soon as the first exception (in a spider or a pipeline) occurs.

Any advice?

Answer

tokarev · Mar 8, 2016

Since Scrapy 0.11, there is the CLOSESPIDER_ERRORCOUNT setting:

An integer which specifies the maximum number of errors to receive before closing the spider. If the spider generates more than that number of errors, it will be closed with the reason closespider_errorcount. If zero (or not set), spiders won't be closed by number of errors.

If it is set to 1, the spider will be closed on the first exception.
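A minimal sketch of what that looks like in a project's settings file (the project layout is the standard one generated by `scrapy startproject`):

```python
# settings.py — close the spider after its first error (Scrapy >= 0.11)
CLOSESPIDER_ERRORCOUNT = 1
```

The setting can also be passed per-run on the command line, which is handy if you only want this behavior during development: `scrapy crawl myspider -s CLOSESPIDER_ERRORCOUNT=1` (here `myspider` is a placeholder spider name).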