
Connecting to MySQL over an SSH tunnel in Python and exporting query results to a spreadsheet

Posted at 2018-10-11

A record of how I automated exporting DB data to a spreadsheet: for security reasons the DB cannot be reached directly, so the script first connects through an SSH tunnel and then writes the query results to the sheet.

Prerequisites

You need to set up the Google Sheets API beforehand, obtain a secret key in JSON format, and place it on the server.
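The code below also assumes the following libraries are installed. This is just a quick sketch of the PyPI package names for the imports used in the script; install them however you prefer:

pip install sshtunnel PyMySQL google-api-python-client oauth2client httplib2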

Environment

  • CentOS Linux release 7.4.1708 (Core)
  • Python 3.6.5

Connection confirmed with (2018/10/24)

  • Sakura Rental Server (さくらのレンタルサーバー)
  • Amazon RDS

Code

db_to_sheet.py (don't save the file as sshtunnel.py; that name shadows the sshtunnel package and breaks the import below)
# CSV
import io
import csv

# Google Spreadsheet
import argparse
from apiclient import discovery
import oauth2client.client
import oauth2client.file
import oauth2client.tools
import httplib2

# MySQL
from sshtunnel import SSHTunnelForwarder
import pymysql.cursors

# Build CSV data from the result of a MySQL query
def sql_to_csv():
    # In-memory file object that receives the query result
    csvfile = io.StringIO()
    writer = csv.writer(csvfile, dialect='excel')

    # Connect to MySQL through the SSH tunnel
    with SSHTunnelForwarder(
        ('SSH host name', 22),                     # SSH server and port (22 is the usual default)
        ssh_host_key=None,                         # set this if you have an SSH host key (2018/10/24)
        ssh_username='SSH user name',
        ssh_password='SSH password',               # or None if you authenticate with a key file (2018/10/24)
        ssh_pkey=None,                             # full path to the private key (2018/10/24)
        remote_bind_address=('MySQL host', 3306),  # MySQL server and port (3306 is the usual default)
    ) as server:
        conn = pymysql.connect(
            host='localhost',
            port=server.local_bind_port,
            user='DB user name',
            password='DB password',
            database='database name',
            charset='utf8',
        )

        cur = conn.cursor()

        # Write your query here (table_name is a placeholder)
        cur.execute("""
          SELECT *
          FROM table_name
        """)

        for row in cur:
            writer.writerow(row)

        cur.close()
        conn.close()

        csv_body = csvfile.getvalue()
        csvfile.close()

    return csv_body

# Google Sheets authentication
def get_sheets_service():
    CLIENT_SECRET_FILE = 'path/to/the/secret/key/client_secret.json'
    CREDENTIAL_FILE = 'credential.json'
    APPLICATION_NAME = 'the name you set when creating the API credentials'

    store = oauth2client.file.Storage(CREDENTIAL_FILE)
    credentials = store.get()
    if not credentials or credentials.invalid:
        SCOPES = 'https://www.googleapis.com/auth/spreadsheets'
        flow = oauth2client.client.flow_from_clientsecrets(CLIENT_SECRET_FILE, SCOPES)
        flow.user_agent = APPLICATION_NAME
        args = '--auth_host_name localhost --logging_level INFO --noauth_local_webserver'
        flags = argparse.ArgumentParser(parents=[oauth2client.tools.argparser]).parse_args(args.split())
        credentials = oauth2client.tools.run_flow(flow, store, flags)

    http = credentials.authorize(httplib2.Http())
    discoveryUrl = 'https://sheets.googleapis.com/$discovery/rest?version=v4'
    service = discovery.build('sheets', 'v4', http=http, discoveryServiceUrl=discoveryUrl)
    return service

# Update the sheet with the CSV data
def upload_csv_to_spreadsheet(csv_body):
    service = get_sheets_service()
    SPREADSHEET_ID = 'xxxxxxxxxx4XOC21YUWakBQ9i3UlJIto_xxxxxxxx'
    # for a URL like https://docs.google.com/spreadsheets/d/xxxxxxxxxx4XOC21YUWakBQ9i3UlJIto_xxxxxxxx/edit#gid=0

    # Paste the CSV data into the sheet: PasteDataRequest
    requests = [
        {
            'pasteData': {
                'coordinate': {
                    'sheetId': 0,  # the number after gid= at the end of the spreadsheet URL
                    'rowIndex': 0,
                    'columnIndex': 0
                },
                'data': csv_body,
                'type': 'PASTE_VALUES',
                'delimiter': ',',
            }
        }
    ]

    body = {
        'requests': requests
    }

    response = service.spreadsheets().batchUpdate(
        spreadsheetId=SPREADSHEET_ID,
        body=body).execute()

def main():
    csv_body = sql_to_csv()
    upload_csv_to_spreadsheet(csv_body)
 
if __name__ == '__main__':
    main()

I run this periodically with cron so the spreadsheet updates automatically.
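For example, a crontab entry like the following runs the script at the top of every hour. This is only a sketch; the interpreter path, script path, and log location are placeholders to adapt to your environment:

0 * * * * /usr/bin/python3 /path/to/db_to_sheet.py >> /path/to/db_to_sheet.log 2>&1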

References
