Changing a Spider's config at run time


#Overview

Background

In a Scrapy Spider, crawl settings can be configured in the settings file (settings.py) or in per-spider custom settings (custom_settings).
However, if these are the only mechanisms you use, you end up creating a separate Spider for every configuration (crawl depth, priority, ...), which is very costly to maintain.
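For reference, here is a minimal sketch of the two places such a setting normally lives; the file name and spider below are illustrative and not part of the original project.

spider_settings_example.py
# Option 1: project-wide default, declared in settings.py:
#     DEPTH_LIMIT = 1
# Option 2: per-spider value, via the custom_settings class attribute
import scrapy


class SampleSpider(scrapy.Spider):
    name = 'sample'
    start_urls = ['https://www.crawler-test.com/']

    # Fixed when the class is defined; changing it means editing the source
    custom_settings = {
        'DEPTH_LIMIT': 1,
    }

    def parse(self, response):
        # Depth of the current response, tracked by Scrapy's DepthMiddleware
        print('url: {} depth: {}'.format(response.url, response.meta.get('depth')))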

What this article covers

Looking into Scrapy's command-line options turned up a way to avoid this, which is described below.

#Goal
Switch the following setting without modifying the Spider's source code.

  • Crawl depth (DEPTH_LIMIT)

#Additional argument
Adding "-s DEPTH_LIMIT=2" to the command sets the crawl depth at execution time.

Example command line
scrapy crawl sample_crawler -s DEPTH_LIMIT=2
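
The same override can also be applied when launching the crawl from a script instead of the CLI. The sketch below is not part of the original article; it assumes a standard Scrapy project so that get_project_settings() and the spider name 'sample_crawler' resolve. Setting the value with priority='cmdline' mirrors what the "-s" flag does, so it also wins over the spider's custom_settings.

run_crawler.py
# A sketch only: run the spider from a script, overriding DEPTH_LIMIT with
# cmdline priority (the same priority the "-s" flag uses).
from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

settings = get_project_settings()
settings.set('DEPTH_LIMIT', 2, priority='cmdline')  # equivalent to "-s DEPTH_LIMIT=2"

process = CrawlerProcess(settings)
process.crawl('sample_crawler')  # spider name, resolved from the project
process.start()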

#Code example

Spider

The Spider's code:

spider.py
import scrapy
from scrapy.spiders import Rule
from scrapy.linkextractors import LinkExtractor


class SampleCrawler(scrapy.spiders.CrawlSpider):
    name = 'sample_crawler'
    allowed_domains = ['www.crawler-test.com']
    start_urls = ['https://www.crawler-test.com/']

    # Default crawl depth; "-s DEPTH_LIMIT=..." on the command line overrides this
    custom_settings = {
        'DEPTH_LIMIT': 1
    }
    rules = [
        # Follow /mobile/ links (excluding /redirects/) and parse them with parse_test1
        Rule(
            LinkExtractor(
                allow=('/mobile/',),
                deny=('/redirects/',)
            ),
            callback='parse_test1'
        ),
        # Follow all other links except /test1/, with no callback
        Rule(
            LinkExtractor(
                allow=('',),
                deny=('/test1/',)
            ),
        ),
    ]

    def __init__(self, *args, **kw):
        # custom_settings is a class attribute, so this always prints the hard-coded value
        print('custom_settings: {}'.format(self.custom_settings))

        super(SampleCrawler, self).__init__(*args, **kw)

    def parse_test1(self, response):
        # Log the URL and the crawl depth recorded in response.meta
        print('response url: {} depth: {}'.format(response.url, response.meta.get('depth')))

Run log 1

The applied settings are shown under "Overridden settings:".

run.log
2020-10-30 23:04:21 [scrapy.utils.log] INFO: Scrapy 2.2.1 started (bot: ScrapySample)
2020-10-30 23:04:21 [scrapy.utils.log] INFO: Versions: lxml 4.5.2.0, libxml2 2.9.5, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 20.3.0, Python 3.6.2 (v3.6.2:5fd33b5, Jul  8 2017, 04:57:36) [MSC v.1900 64 bit (AMD64)], pyOpenSSL 19.1.0 (OpenSSL 1.1.1g  21 Apr 2020), cryptography 3.0, Platform Windows-10-10.0.18362-SP0
2020-10-30 23:04:21 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2020-10-30 23:04:21 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'ScrapySample',
 'DEPTH_LIMIT': 1,
 'NEWSPIDER_MODULE': 'ScrapySample.spiders',
 'ROBOTSTXT_OBEY': True,
 'SPIDER_MODULES': ['ScrapySample.spiders']}
2020-10-30 23:04:21 [scrapy.extensions.telnet] INFO: Telnet Password: ed92b4e8e23a16a0
2020-10-30 23:04:21 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
custom_settings: {'DEPTH_LIMIT': 1}

Run log 2

Re-run with a different command-line argument.
Added argument: "-s DEPTH_LIMIT=2"

run.log
2020-10-30 23:08:46 [scrapy.utils.log] INFO: Scrapy 2.2.1 started (bot: ScrapySample)
2020-10-30 23:08:46 [scrapy.utils.log] INFO: Versions: lxml 4.5.2.0, libxml2 2.9.5, cssselect 1.1.0, parsel 1.6.0, w3lib 1.22.0, Twisted 20.3.0, Python 3.6.2 (v3.6.2:5fd33b5, Jul  8 2017, 04:57:36) [MSC v.1900 64 bit (AMD64)], pyOpenSSL 19.1.0 (OpenSSL 1.1.1g  21 Apr 2020), cryptography 3.0, Platform Windows-10-10.0.18362-SP0
2020-10-30 23:08:46 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2020-10-30 23:08:46 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'ScrapySample',
 'DEPTH_LIMIT': '2',
 'NEWSPIDER_MODULE': 'ScrapySample.spiders',
 'ROBOTSTXT_OBEY': True,
 'SPIDER_MODULES': ['ScrapySample.spiders']}
2020-10-30 23:08:46 [scrapy.extensions.telnet] INFO: Telnet Password: 22caac9b06a21a64
2020-10-30 23:08:47 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.telnet.TelnetConsole',
 'scrapy.extensions.logstats.LogStats']
custom_settings: {'DEPTH_LIMIT': 1}
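
Note that even with the override, the spider still prints custom_settings: {'DEPTH_LIMIT': 1}. custom_settings is a plain class attribute and is never rewritten; the value actually in effect (shown as 'DEPTH_LIMIT': '2' under "Overridden settings:") comes from the command-line setting, which has higher priority than per-spider settings. To check the effective value from inside a spider, self.settings can be read once the crawler has bound it. The spider below is a minimal, hypothetical sketch, not part of the original project.

settings_check_spider.py
# A sketch only: self.settings is populated by the crawler, so it reflects the
# "-s DEPTH_LIMIT=2" override, while the custom_settings class attribute keeps
# its hard-coded value.
import scrapy


class SettingsCheckSpider(scrapy.Spider):
    name = 'settings_check'
    start_urls = ['https://www.crawler-test.com/']
    custom_settings = {'DEPTH_LIMIT': 1}

    def start_requests(self):
        print('custom_settings:       {}'.format(self.custom_settings))
        print('effective DEPTH_LIMIT: {}'.format(self.settings.getint('DEPTH_LIMIT')))
        return super().start_requests()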