
python - How do I stop all spiders and the engine immediately after a condition in a pipeline is met?

We have a system written with Scrapy to crawl a few websites. There are several spiders, and a few cascaded pipelines through which the items from all crawlers pass. One of the pipeline components queries the Google servers to geocode addresses. Google imposes a limit of 2,500 requests per day per IP address, and threatens to ban an IP address if it keeps querying even after Google has responded with the warning message 'OVER_QUERY_LIMIT'.

Hence I want to know of a mechanism I can invoke from within the pipeline that will completely and immediately stop all further crawling/processing by all spiders and by the main engine.

I have checked other, similar questions, but their answers have not worked:

from scrapy.project import crawler
crawler._signal_shutdown(9, 0)  # run this if the connection fails

This does not work, because it takes time for the spider to stop executing, so many more requests are made to Google in the meantime (which could get my IP address banned).

import sys
sys.exit("SHUT DOWN EVERYTHING!")

This one doesn't work at all: items keep getting generated and passed to the pipeline, even though the log reports sys.exit() -> exceptions.SystemExit raised (to no effect).

crawler.engine.close_spider(self, 'log message')

This one has the same problem as the first case above.

I tried:

scrapy.project.crawler.engine.stop()

To no avail.

EDIT: If I do in the pipeline:

from scrapy.contrib.closespider import CloseSpider

what should I pass as the 'crawler' argument to CloseSpider's __init__() from the scope of my pipeline?



1 Reply


You can raise a CloseSpider exception to close down a spider. However, I don't think this will work from a pipeline.

EDIT: avaleske notes in the comments to this answer that he was able to raise a CloseSpider exception from a pipeline. It would be wisest to use that approach.
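
For illustration, here is a minimal sketch of that approach. GeocodePipeline and its geocode() stub are invented names standing in for the real Google lookup; only CloseSpider itself comes from Scrapy:

from scrapy.exceptions import CloseSpider

class GeocodePipeline(object):
    def geocode(self, address):
        # Stand-in for the actual request to the Google geocoding API.
        return {'status': 'OVER_QUERY_LIMIT', 'results': []}

    def process_item(self, item, spider):
        response = self.geocode(item.get('address', ''))
        if response['status'] == 'OVER_QUERY_LIMIT':
            # Stops the spider; requests already in flight will still finish.
            raise CloseSpider('Google geocoding quota exceeded')
        return item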

A similar situation has been described on the Scrapy Users group, in this thread.

I quote:

To close a spider from any part of your code you should use the engine.close_spider method. See this extension for a usage example: https://github.com/scrapy/scrapy/blob/master/scrapy/contrib/closespider.py#L61

You could write your own extension, using closespider.py as an example, that shuts down a spider when a certain condition is met.
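
A rough sketch of what such an extension could look like, patterned after closespider.py. The extension name and the quota_exceeded flag are assumptions; the from_crawler hook and the item_scraped signal are standard Scrapy:

from scrapy import signals

class CloseOnQuotaExceeded(object):
    def __init__(self, crawler):
        self.crawler = crawler
        crawler.signals.connect(self.item_scraped, signal=signals.item_scraped)

    @classmethod
    def from_crawler(cls, crawler):
        return cls(crawler)

    def item_scraped(self, item, spider):
        # quota_exceeded is an assumed flag, set elsewhere (e.g. by a pipeline).
        if getattr(spider, 'quota_exceeded', False):
            self.crawler.engine.close_spider(spider, 'geocoding quota exceeded')

You would then enable it through the EXTENSIONS setting, as with the bundled close-spider extension.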

Another "hack" would be to set a flag on the spider in the pipeline. For example:

pipeline:

def process_item(self, item, spider):
    if some_flag:  # e.g. Google has answered with 'OVER_QUERY_LIMIT'
        spider.close_down = True
    return item

spider:

from scrapy.exceptions import CloseSpider

def parse(self, response):
    if getattr(self, 'close_down', False):  # defaults to False if never set
        raise CloseSpider(reason='API usage exceeded')
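
One caveat with this hack: items already scraped will still flow through the pipeline while the spider is closing, so the pipeline itself should also stop querying Google once the flag is set. A minimal sketch, assuming scrapy.exceptions.DropItem is used to discard the remaining items:

from scrapy.exceptions import DropItem

class GeocodePipeline(object):
    over_quota = False  # flipped once Google answers 'OVER_QUERY_LIMIT'

    def process_item(self, item, spider):
        if self.over_quota:
            # Never touch the Google API again once the quota is hit.
            raise DropItem('geocoding disabled: quota exceeded')
        # ... do the geocoding lookup, set self.over_quota on failure ...
        return item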
