Configuring BigQuery Data Transfer Service for Amazon S3 with Terraform

Posted at 2021-10-02

data "google_project" "project" {
}

resource "google_project_iam_member" "permissions" {
  role   = "roles/iam.serviceAccountShortTermTokenMinter"
  member = "serviceAccount:service-${data.google_project.project.number}@gcp-sa-bigquerydatatransfer.iam.gserviceaccount.com"
}

resource "google_bigquery_data_transfer_config" "s3-transfer" {
  depends_on             = [google_project_iam_member.permissions]

  display_name           = "s3-transfer"
  location               = "asia-northeast1"
  data_source_id         = "amazon_s3"
  schedule               = "every day 21:00" # UTC
  destination_dataset_id = google_bigquery_dataset.my_dataset.dataset_id
  params                 = {
    destination_table_name_template = "my_table"
    data_path                       = "s3://my_bucket/my_prefix/*"
    access_key_id                   = var.aws_access_key
    secret_access_key               = var.aws_secret_key
    file_format                     = "PARQUET"
  }
}
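
The config above references a dataset and two variables that are not shown in the article. A minimal sketch of those definitions, assuming the dataset ID and variable names used here, might look like this:

resource "google_bigquery_dataset" "my_dataset" {
  dataset_id = "my_dataset"      # assumed ID; referenced via destination_dataset_id above
  location   = "asia-northeast1" # should match the transfer config's location
}

variable "aws_access_key" {
  type      = string
  sensitive = true
}

variable "aws_secret_key" {
  type      = string
  sensitive = true
}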

The Terraform documentation does not include an example of an S3 transfer, so the valid value for data_source_id and the keys that go under params have to be looked up separately. The data sources supported by BigQuery Data Transfer Service can be obtained from the output of the command described in the documentation.
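
As one way to look this up, the BigQuery Data Transfer API exposes a dataSources.list method; a request like the sketch below (PROJECT_ID is a placeholder, and authentication via gcloud is assumed) should return each supported dataSourceId together with the parameter keys it accepts, including the amazon_s3 entry whose parameters correspond to the params block above.

# List the data sources available to a project, together with their accepted parameters.
curl -s -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  "https://bigquerydatatransfer.googleapis.com/v1/projects/PROJECT_ID/dataSources"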
