
Investigating nationwide shop closings and openings due to COVID-19

Posted at 2020-05-01

# How many shops has COVID-19 forced to close?
The site 開店閉店.com aggregates information on shop openings and closings across Japan. Partly as a scraping exercise, let's build a prefecture-by-prefecture color map of the opening/closing balance for March-April of 2020 and 2019.

# Method
The approach is to count, on the site above, the shops that fall into each region and each period. Conveniently, the listings are already categorized by region and each page is stored in descending (newest-first) order, so we can take advantage of that. The scraping itself uses Python's beautifulsoup4.
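As a rough sketch of that step (using the same URL and CSS classes as the full implementation below), the region links can be pulled out like this:

```python
from urllib import request
from bs4 import BeautifulSoup

# Listing page of closed shops, grouped by region.
url = 'https://kaiten-heiten.com/heiten/area-close/'
soup = BeautifulSoup(request.urlopen(url), 'html.parser')

# One <a class="links"> element per region; each links to that region's own listing.
for a in soup.find_all('a', class_='links'):
    print(a.text, a.get('href'))
```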

# Results and discussion
Let's look at the results first (the implementation is given below). The values on the colorbar are normalized per prefecture as
      ratio = $(N_{\mathrm{open}} - N_{\mathrm{close}}) / (N_{\mathrm{open}} + N_{\mathrm{close}})$
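For example, with hypothetical counts of $N_{\mathrm{open}} = 20$ and $N_{\mathrm{close}} = 30$ in some prefecture, the ratio would be $(20 - 30)/(20 + 30) = -0.2$: negative values mean more closings than openings, positive values the opposite.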

(Figure: 2020.png, color map for March-April 2020)
(Figure: 2019.png, color map for March-April 2019)

Looking at the maps, there is no significant difference that can be attributed to the coronavirus. Whether the impact is simply small, or will only show up with a delay, is something we will have to judge from how things develop from here. I hope the damage turns out to be small.

# Implementation
First, collect the data from the site above.

shop_openup_closedown_ratio_1.py

from bs4 import BeautifulSoup
from urllib import request
import datetime
import numpy as np

def period(year, month, year_s=2019, year_e=2019, month_s=3, month_e=4):
    # Classify (year, month) relative to the window [year_s/month_s, year_e/month_e]:
    # returns 2 if inside the window, 1 if after it, False if before it.
    res = False
    if (year_s <= year <= year_e) and (month_s <= month <= month_e):
        res = 2
    if year >= year_e:
        if year > year_e:
            res = 1
        elif month > month_e:
            res = 1

    return res
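# Examples with the default window (March-April 2019):
#   period(2019, 3)  -> 2      (inside the window)
#   period(2019, 6)  -> 1      (after the window)
#   period(2018, 12) -> False  (before the window)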



def main(year_s=2020, year_e=2020, month_s=4, month_e=4):
    # Count closings (index 0) and openings (index 1) per region within the window.
    dic = {}
    states = ['close', 'open']
    for state in range(len(states)):
        # Region index page for closed / opened shops.
        url = 'https://kaiten-heiten.com/heiten/area-' + states[state] + '/'
        response = request.urlopen(url)
        soup = BeautifulSoup(response, 'html.parser')

        for a in soup.find_all('a', class_="links"):   # one link per region
            link = a.get('href')
            region = a.text
            print(region)

            if dic.get(region) is None:
                dic[region] = [0, 0]   # [closings, openings]



            # First page of this region's listing (entries are newest-first).
            url = link
            response = request.urlopen(url)
            soup = BeautifulSoup(response, 'html.parser')

            shop_list = soup.find_all('span', class_='post_time')

            # Date of the oldest (last) entry on the current page.
            year_last = int(shop_list[-1].text[:5])
            month_last = int(shop_list[-1].text[6:8])

            for a in shop_list:

                year = int(a.text[:5])
                month = int(a.text[6:8])

                if period(year, month, year_s, year_e, month_s, month_e) == 2:
                    dic[region][state] += 1

                # Keep paginating while the oldest entry on the current page is
                # still inside or after the window (listings run newest-first).
                while period(year_last, month_last, year_s, year_e, month_s, month_e) >= 1:

                    next_p = soup.find('a', class_='next page-numbers')

                    if next_p is not None:
                        link = next_p.get('href')
                    else:
                        # No further pages for this region.
                        break

                    url = link
                    response = request.urlopen(url)
                    soup = BeautifulSoup(response, 'html.parser')


                    shop_list = soup.find_all('span', class_='post_time')

                    # Oldest entry on the newly fetched page.
                    year_last = int(shop_list[-1].text[:5])
                    month_last = int(shop_list[-1].text[6:8])

                    print(year_last, month_last)

                    for a in shop_list:

                        year = int(a.text[:5])
                        month = int(a.text[6:8])

                        if period(year, month, year_s, year_e, month_s, month_e) == 2:
                            dic[region][state] += 1

                            
    regions = list(dic.keys())
    vals_ = np.array(list(dic.values()))

    # The first 22 entries are Hokkaido sub-areas; merge them into a single 北海道 entry.
    hk = 22

    hk_name = ['北海道']
    hk_vals = np.array([vals_[:hk, 0].sum(), vals_[:hk, 1].sum()])

    regions = hk_name + regions[hk:]
    vals = [list(hk_vals)] + list(vals_[hk:])


    # ratio = (N_open - N_close) / (N_open + N_close) per region.
    ratio = np.zeros(len(regions))

    for i in range(len(regions)):
        ratio[i] = (vals[i][1] - vals[i][0]) / (vals[i][1] + vals[i][0])

    return regions, vals, ratio, vals_


regions_2020, vals_2020, ratio_2020, vals_raw_2020 = main(2020, 2020, 3, 4)
regions_2019, vals_2019, ratio_2019, vals_raw_2019 = main(2019, 2019, 3, 4)

This retrieves the counts for March-April of 2020 and 2019.
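As a quick check on what came back, a minimal sketch (using only the variables returned by `main` above) is to print the prefecture with the lowest ratio, i.e. the most closing-heavy balance:

```python
import numpy as np

# Prefecture with the most closing-heavy balance in March-April 2020.
worst = int(np.argmin(ratio_2020))
print(regions_2020[worst], ratio_2020[worst], vals_2020[worst])  # name, ratio, [closings, openings]
```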

shop_openup_closedown_ratio_2.py

import matplotlib.colors
import matplotlib.pyplot as plt
from japanmap import picture



def mapping(regions, ratio, name, a=0.1, b=1):
    # Draw a prefecture-level color map of `ratio`, with colors normalized to [a, b].
    n_min = a
    n_max = b

    cmap = plt.cm.rainbow
    norm = matplotlib.colors.Normalize(vmin=n_min, vmax=n_max)

    def color_scale(r):
        # Map a ratio value to an RGB tuple in the 0-255 range used by japanmap.
        tmp = cmap(norm(r))
        return (tmp[0] * 255, tmp[1] * 255, tmp[2] * 255)

    # Build {prefecture name: RGB} and let japanmap render it.
    dic = {}
    for k in range(len(regions)):
        dic[regions[k]] = color_scale(ratio[k])

    lab = name + ' 3~4'
    fig = plt.figure(figsize=(15, 9))
    plt.title(lab, fontsize=15)
    plt.imshow(picture(dic))

    # Colorbar that matches the normalization used for the map.
    sm = plt.cm.ScalarMappable(cmap=cmap, norm=norm)
    sm.set_array([])   # required by some matplotlib versions when the mappable has no data

    plt.colorbar(sm)
    plt.show()

    fig.savefig(name)
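
The drawing itself relies on japanmap's `picture`, which takes a dict from prefecture name to color and returns an image array. A minimal standalone sketch (prefecture names and colors chosen arbitrarily here):

```python
import matplotlib.pyplot as plt
from japanmap import picture

# Color two prefectures by name; prefectures not in the dict keep the default color.
colors = {'北海道': (255, 0, 0), '東京都': (0, 0, 255)}
plt.imshow(picture(colors))
plt.show()
```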

This prepares the map drawing. The plots are then produced as follows.

shop_openup_closedown_ratio_3.py

# Use a common color scale for both years so the two maps are directly comparable.
a = list(ratio_2020) + list(ratio_2019)

max_n = max(a)
min_n = min(a)
mapping(regions_2019, ratio_2019, '2019', min_n, max_n)
mapping(regions_2020, ratio_2020, '2020', min_n, max_n)


Data source

開店閉店.com
