FastAPI and PostgreSQL with Docker

Posted at 2023-05-16

Introduction

I set up an environment with FastAPI and PostgreSQL.
Since I was used to frameworks like Laravel and Rails, getting FastAPI up and running felt a bit tedious,
so I wrote down the setup steps, partly as a memo for myself.
Details of each piece of code are omitted.

For those in a hurry, the final source is here:
https://github.com/nonamenme/docker-fastapi-postgres
Bring it up with docker-compose up -d and you have a working FastAPI environment.

1. Preparation

Prepare files with the following layout:

─── project
    ├── docker-compose.yml
    ├── Dockerfile
    ├── requirements.txt
    └── fastapi
        └── main.py

・Contents of each file

docker-compose.yml
version: '3.7'

services:
  fastapi:
    build: .
    volumes:
      - ./fastapi:/app
    ports:
      - 8000:8000
    restart: always
    tty: true
    depends_on:
      - db

  db:
    image: postgres:15
    container_name: postgres-db
    volumes:
      - postgres_data:/var/lib/postgresql/data/
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=password
    ports:
      - 5432:5432

volumes:
  postgres_data:

・Change container names and the like as needed

Dockerfile
FROM python:3.9-alpine

ENV LANG C.UTF-8
ENV TZ Asia/Tokyo

WORKDIR /app

# pip installs
COPY ./requirements.txt requirements.txt

RUN apk add --no-cache postgresql-libs \
 && apk add --no-cache --virtual .build-deps gcc musl-dev postgresql-dev \
 && python3 -m pip install -r /app/requirements.txt --no-cache-dir \
 && apk --purge del .build-deps

COPY . /app

# Start FastAPI
CMD ["uvicorn", "main:app", "--reload", "--host", "0.0.0.0", "--port", "8000"]

・"--reload"をつけることで、main.pyに変更が入った際に即時反映することができる

main.py
import uvicorn
from fastapi import FastAPI

app = FastAPI()

@app.get("/")
async def root():
    return {"message": "Hello World"}

if __name__ == "__main__":
    uvicorn.run(app, host="0.0.0.0", port=8000)
requirements.txt
fastapi
uvicorn

SQLAlchemy==1.3.22
SQLAlchemy-Utils==0.41.1
alembic==1.5.2
psycopg2==2.8.6
psycopg2-binary==2.9.3
pydantic[email]==1.6.1
python-jose[cryptography]==3.2.0
python-multipart==0.0.6
python-dotenv==1.0.0

2. Start the containers

$ docker-compose up -d

3. Check that the containers are running

$ docker-compose ps
       Name                     Command               State           Ports         
------------------------------------------------------------------------------------
fast-api_fastapi_1   uvicorn main:app --reload  ...   Up      0.0.0.0:8000->8000/tcp
postgres-db          docker-entrypoint.sh postgres    Up      0.0.0.0:5432->5432/tcp

At this point the app is already running.
If you open http://localhost:8000 in a browser,

{"message":"Hello World"}

should be displayed.

You can also open http://localhost:8000/docs in a browser to view the API with Swagger UI.
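As a quick sanity check from the host, something like the following should return the same JSON (a sketch; it assumes the requests package is installed on the host, which is not part of this setup):

import requests

resp = requests.get("http://localhost:8000/")
print(resp.status_code)  # expected: 200
print(resp.json())       # expected: {'message': 'Hello World'}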
(Screenshot: Swagger UI at http://localhost:8000/docs)

4. Create migrations with alembic

This part was a bit of a hassle...
・alembic is a database migration tool

4-1. Enter the app container

$ docker-compose exec fastapi sh

4-2. Create the migration environment with alembic init

/app # alembic init migration
  Creating directory /app/migration ...  done
  Creating directory /app/migration/versions ...  done
  Generating /app/migration/script.py.mako ...  done
  Generating /app/migration/README ...  done
  Generating /app/alembic.ini ...  done
  Generating /app/migration/env.py ...  done
  Please edit configuration/connection/logging settings in '/app/alembic.ini' before proceeding.

Once this is done, the file structure looks like this:

─── project
    ├── docker-compose.yml
    ├── Dockerfile
    ├── requirements.txt
    └── fastapi
        ├── main.py
+       ├── alembic.ini
+       └── migration
+           ├── versions
+           ├── env.py
+           ├── README
+           └── script.py.mako

4-3. Change the database connection settings

4-3-1. Create .env

/app # touch .env
.env
DATABASE_URL=postgresql://postgres:password@postgres-db:5432/postgres

4-3-2. Create core/config.py

/app # mkdir core
/app # touch core/config.py

4-3-3. Create a settings file that loads the environment variables

core/config.py
import os
from functools import lru_cache
from pydantic import BaseSettings

PROJECT_ROOT = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

class Environment(BaseSettings):
    """ 環境変数を読み込む
    """
    database_url: str

    class Config:
        env_file = os.path.join(PROJECT_ROOT, '.env')

@lru_cache
def get_env():
    """ @lru_cacheで.envの結果をキャッシュする
    """
    return Environment()
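For reference, a minimal sketch of reading the setting from application code (the call site itself is hypothetical):

from core.config import get_env

# Thanks to @lru_cache, .env is parsed only once per process;
# subsequent calls return the cached Environment instance.
database_url = get_env().database_url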

4-3-4. Set the database connection target

migration/env.py
+ import sys
+ # The core directory cannot be imported via a relative path, so add the project root to sys.path
+ sys.path = ['', '..'] + sys.path[1:]

+ import os
+ from core.config import PROJECT_ROOT
+ from dotenv import load_dotenv

from logging.config import fileConfig

from sqlalchemy import engine_from_config
from sqlalchemy import pool

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
target_metadata = None

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
+ load_dotenv(dotenv_path=os.path.join(PROJECT_ROOT, '.env'))
+ config.set_main_option('sqlalchemy.url', os.getenv('DATABASE_URL'))

def run_migrations_offline():
...

4-4. Change the file name format of generated migration files

alembic.ini
# A generic, single database configuration.

[alembic]
# path to migration scripts
script_location = migration

# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s
+ file_template = %%(year)d%%(month).2d%%(day).2d%%(hour).2d%%(minute).2d_%%(slug)s

# timezone to use when rendering the date
# within the migration file as well as the filename.
# string value is passed to dateutil.tz.gettz()
# leave blank for localtime
# timezone =
+ timezone = Asia/Tokyo

# max length of characters to apply to the
# "slug" field
# truncate_slug_length = 40
...

4-5. Create a migration file

/app # alembic revision -m "create users table"
  Generating /app/migration/versions/202305121847_create_users_table.py ...  done

This generates a migration file under migration/versions/.

5. Run the migration

5-1. Edit the migration file

xxxx_create_users_table.py
"""create users table

Revision ID: xxxxxxxx
Revises: 
Create Date: YYYY-MM-dd hh:mm:ss.ssssss

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'xxxxxxxx'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
-   pass
+   op.create_table(
+       'users',
+        sa.Column('id', sa.Integer, primary_key=True),
+        sa.Column('name', sa.String(50), nullable=False),
+        sa.Column('login_id', sa.String(50), nullable=False),
+        sa.Column('password', sa.Text(), nullable=False),
+   )

def downgrade():
-   pass
+   op.drop_table('users')

5-2. Run the migration

/app # alembic upgrade head
INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.runtime.migration] Running upgrade  -> xxxxxxxx, create users table

This runs the migration and creates the users table in the PostgreSQL database.
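If you want to confirm it from Python as well, here is a minimal sketch that can be run inside the fastapi container (it reuses DATABASE_URL via core/config.py from step 4-3-3):

from sqlalchemy import create_engine, inspect
from core.config import get_env

engine = create_engine(get_env().database_url)
# Should list 'users' along with alembic's own 'alembic_version' table
print(inspect(engine).get_table_names())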

6. Generate migration files from models

6-1. Create models.py

/app # touch migration/models.py
migration/models.py
from datetime import datetime

from sqlalchemy import create_engine, Column, String, Integer, Text, DateTime
from sqlalchemy.ext.declarative import declarative_base
from core.config import get_env

# Create the Engine
Engine = create_engine(
    get_env().database_url,
    encoding="utf-8",
    echo=False
)

BaseModel = declarative_base()

6-2. Create the User model

migration/models.py
from datetime import datetime

from sqlalchemy import create_engine, Column, String, Integer, Text, DateTime
from sqlalchemy.ext.declarative import declarative_base
from core.config import get_env

# Create the Engine
Engine = create_engine(
    get_env().database_url,
    encoding="utf-8",
    echo=False
)

BaseModel = declarative_base()

+ class User(BaseModel):
+    __tablename__ = 'users'
+
+    id = Column(Integer, primary_key=True)
+    name = Column(String(50), nullable=False)
+    login_id = Column(String(50), unique=True, nullable=False)
+    password = Column(Text, nullable=False)
+    created_at = Column(DateTime, default=datetime.now, nullable=False) # added
+    updated_at = Column(DateTime, default=datetime.now, nullable=False) # added

・The difference from the earlier migration file is the added created_at and updated_at columns
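For reference, a minimal sketch of using the User model through a plain SQLAlchemy session (hypothetical usage; session handling and password hashing are outside the scope of this article):

from sqlalchemy.orm import sessionmaker
from migration.models import Engine, User

Session = sessionmaker(bind=Engine)
session = Session()

# Insert one row; the password is stored as-is here for brevity
session.add(User(name="Taro", login_id="taro", password="secret"))
session.commit()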

6-3. Configure alembic to use the created models

migration/env.py
import sys
sys.path = ['', '..'] + sys.path[1:]

import os
from core.config import PROJECT_ROOT
from dotenv import load_dotenv

from logging.config import fileConfig

- from sqlalchemy import engine_from_config
- from sqlalchemy import pool

from alembic import context

+ from migration.models import BaseModel, Engine

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
- target_metadata = None
+ # target_metadata = None
+ target_metadata = BaseModel.metadata


# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.
load_dotenv(dotenv_path=os.path.join(PROJECT_ROOT, '.env'))
config.set_main_option('sqlalchemy.url', os.getenv('DATABASE_URL'))

def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well.  By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """
-   connectable = engine_from_config(
-       config.get_section(config.config_ini_section),
-       prefix="sqlalchemy.",
-       poolclass=pool.NullPool,
-   )

+   # connectable = engine_from_config(
+   #     config.get_section(config.config_ini_section),
+   #     prefix="sqlalchemy.",
+   #     poolclass=pool.NullPool,
+   # )
+   url = config.get_main_option("sqlalchemy.url")
+   connectable = Engine

    with connectable.connect() as connection:
        context.configure(
+           url=url,
            connection=connection,
            target_metadata=target_metadata
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()

6-4. Generate the migration file

/app # alembic revision --autogenerate -m "add columns"
INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
INFO  [alembic.ddl.postgresql] Detected sequence named 'users_id_seq' as owned by integer column 'users(id)', assuming SERIAL and omitting
INFO  [alembic.autogenerate.compare] Detected added column 'users.created_at'
INFO  [alembic.autogenerate.compare] Detected added column 'users.updated_at'
INFO  [alembic.autogenerate.compare] Detected added unique constraint 'None' on '['login_id']'
  Generating /app/migration/versions/YYYYMMddHHmm_add_columns.py ...  done

Adding the --autogenerate option makes alembic compare models.py (via target_metadata) against the current database schema and generate a migration file containing only the differences.

YYYYMMddHHmm_add_columns.py
"""add columns

Revision ID: xxxxxxx
Revises: yyyyyyyyy
Create Date: YYYY-MM-dd HH:mm:ss.ssssss+09:00

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'xxxxxxx'
down_revision = 'yyyyyyyyy'
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('users', sa.Column('created_at', sa.DateTime(), nullable=False))
    op.add_column('users', sa.Column('updated_at', sa.DateTime(), nullable=False))
    op.create_unique_constraint(None, 'users', ['login_id'])
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_constraint(None, 'users', type_='unique')
    op.drop_column('users', 'updated_at')
    op.drop_column('users', 'created_at')
    # ### end Alembic commands ###

6-5. Run the migration

/app # alembic upgrade head

7. Done

This completes the environment setup.
It was my first time setting up FastAPI and I fumbled quite a bit, but it was fun.

References

https://qiita.com/Butterthon/items/a55daa0e7f168fee7ef0
https://qiita.com/penpenta/items/c993243c4ceee3840f30
https://qiita.com/hkyo/items/65321d7015121ccf369f
