
Scraping multiple pages with BeautifulSoup

Out of the blue I got another request to save data spread across multiple pages into a database, so I hacked this together in a rush. CSS selectors are ridiculously convenient, aren't they?

#The actual code

scl.py
import requests, bs4   # lxml also needs to be installed for the 'lxml' parser
import sqlite3

# Open the database once and create the table up front;
# the unique constraint on "urls" keeps duplicate links out.
con = sqlite3.connect('url.db')
c = con.cursor()
c.execute('''CREATE TABLE IF NOT EXISTS urldata(urls unique)''')

url = 'https://www.〜'

for i in range(55):
    res = requests.get(url)
    res.raise_for_status()
    soup = bs4.BeautifulSoup(res.text, 'lxml')

    # Grab every plan link on the listing page with a CSS selector
    for u in soup.select('.plan-module > .plan-link.plan-image-container'):
        urls = 'https://www.〜' + u.attrs['href']
        # INSERT OR IGNORE so a re-run doesn't die on the unique constraint
        c.execute('INSERT OR IGNORE INTO urldata VALUES (?)', [urls])

    con.commit()

    # Move on to the next listing page
    url = 'https://www.〜?=' + str(i + 1)

con.close()
print('success')
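
Once the listing pages have been walked, the collected links can be read back out of url.db for the detail-page pass. A minimal sketch, assuming only the urldata table created above:

import sqlite3

# Read back every link collected by scl.py
con = sqlite3.connect('url.db')
for (stored_url,) in con.execute('SELECT urls FROM urldata'):
    print(stored_url)   # e.g. hand each one to requests.get() later
con.close()

Storing the column as unique and inserting with INSERT OR IGNORE means the script can be re-run without crashing on links it has already saved.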

But lo and behold, it then turned out that the pagination is a dynamic element, so there is no getting around Selenium.
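
Since the next-page control is rendered by JavaScript, the idea would be to let Selenium drive a real browser and hand each rendered page to BeautifulSoup. Below is a minimal sketch of that approach; the '.pagination-next' selector and the fixed one-second sleep are assumptions for illustration, not the site's real markup.

import time
import sqlite3
import bs4
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get('https://www.〜')

con = sqlite3.connect('url.db')
c = con.cursor()
c.execute('''CREATE TABLE IF NOT EXISTS urldata(urls unique)''')

for _ in range(55):
    # Parse the page Selenium has already rendered
    soup = bs4.BeautifulSoup(driver.page_source, 'lxml')
    for u in soup.select('.plan-module > .plan-link.plan-image-container'):
        c.execute('INSERT OR IGNORE INTO urldata VALUES (?)',
                  ['https://www.〜' + u.attrs['href']])
    con.commit()

    # Click the (hypothetical) next-page button to trigger the dynamic paging,
    # then give the new page a moment to load
    driver.find_element(By.CSS_SELECTOR, '.pagination-next').click()
    time.sleep(1)

con.close()
driver.quit()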
