
Scrapy's Asynchronous DB Pipeline

Posted at 2016-10-13

There didn't seem to be much information on this out there, so here it is.

According to "Learning Scrapy" (http://shop.oreilly.com/product/9781784399788.do), the only book dedicated to Scrapy, a DB pipeline is best written along the following lines.

Writing the pipeline in the usual synchronous style blocks the crawl, so write it asynchronously instead. Twisted ships a DB connection-pooling mechanism (twisted.enterprise.adbapi), so use that; any database with a DB-API 2.0 driver will work.
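Before the Twisted version, here is a rough standard-library sketch of the same idea (my own analogy, not what adbapi actually does internally): blocking DB writes are handed off to a small thread pool so the submitting thread can keep going, and completion is only awaited later.

```python
import sqlite3
import threading
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for adbapi.ConnectionPool: blocking DB writes run
# on worker threads so the caller is free to keep crawling. (adbapi does
# this with Twisted threads and returns Deferreds instead of Futures.)
pool = ThreadPoolExecutor(max_workers=3)
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("CREATE TABLE text_data (url TEXT, title TEXT, content TEXT)")
lock = threading.Lock()  # sqlite3 connections are not thread-safe

def do_insert(item):
    # Runs on a pool thread; the submitting thread does not wait here.
    with lock:
        conn.execute(
            "INSERT INTO text_data (url, title, content) VALUES (?, ?, ?)",
            (item["url"], item["title"], item["content"]),
        )
        conn.commit()

futures = [
    pool.submit(do_insert,
                {"url": "http://example.com/%d" % i, "title": "t", "content": "body"})
    for i in range(5)
]
for f in futures:
    f.result()  # wait only at the end, like yielding on the Deferred

count = conn.execute("SELECT COUNT(*) FROM text_data").fetchone()[0]
print(count)  # → 5
```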

import logging
from twisted.enterprise import adbapi
from twisted.internet import reactor, defer

class DatabaseWriterPipeline(object):
    @classmethod
    def from_crawler(cls, crawler):
        settings = crawler.settings
        return cls(
            settings.get('PIPELINE_DATABASE_ENGINE'),
            settings.get('PIPELINE_DATABASE_HOST'),
            settings.getint('PIPELINE_DATABASE_PORT'),
            settings.get('PIPELINE_DATABASE_USER'),
            settings.get('PIPELINE_DATABASE_PASS'),
            settings.get('PIPELINE_DATABASE_DB'),
        )

    def __init__(self, driver, host, port, user, passwd, db):
        self.dbpool = adbapi.ConnectionPool(driver,
            charset='utf8',
            use_unicode=True,
            connect_timeout=60,
            host=host,
            port=port,
            user=user,
            passwd=passwd,
            db=db,
            cp_min=3,
            cp_max=10,
            cp_reconnect=True,
        )
        self.logger = logging.getLogger(__name__)

    def close_spider(self, spider):
        self.dbpool.close()

    @defer.inlineCallbacks
    def process_item(self, item, spider):
        yield self.dbpool.runInteraction(self.do_replace, item)
        defer.returnValue(item)

    @staticmethod
    def do_replace(tx, item):
        # Adapt to your own item design and table schema
        sql = "INSERT INTO text_data (url, title, content) VALUES (%s, %s, %s)"
        args = (
            item["url"][:2048],
            item["title"][:512],
            item["content"][:65535],
        )
        tx.execute(sql, args)
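The article never shows the table definition, but the `[:2048]`/`[:512]`/`[:65535]` slices in `do_replace` suggest a schema along these lines (a sketch for MySQL; the column types are my assumption):

```sql
-- Hypothetical DDL inferred from the truncation lengths above.
CREATE TABLE text_data (
    url     VARCHAR(2048) NOT NULL,
    title   VARCHAR(512)  NOT NULL,
    content TEXT          NOT NULL
) CHARACTER SET utf8;
```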

Create the target table beforehand, then add something like this to settings.py.

PIPELINE_DATABASE_ENGINE = 'MySQLdb'
PIPELINE_DATABASE_HOST = 'localhost'
PIPELINE_DATABASE_PORT = 3306
PIPELINE_DATABASE_DB = 'crawl'
PIPELINE_DATABASE_USER = 'root'
PIPELINE_DATABASE_PASS = ''

ITEM_PIPELINES = {
    'extensions.pipelines.DatabaseWriterPipeline': 100, # the pipeline class above
}

It runs fast without the DB backing up (the DB load is far lower than the crawling load to begin with).
