Building a Bot on GCP That Periodically Checks the AWS Blog for Certificate-Related Articles and Sends Notifications


📝 Purpose

Build a bot on Google Cloud Platform that automatically sends a Gmail notification whenever an article related to certificates, such as "ACM" or "certificate", is posted on the AWS Security Blog.

  • Check target: https://aws.amazon.com/blogs/security/
  • Notification destination: Gmail (using an App Password)
  • Scheduled run: every Monday at 09:00 (JST)
  • Duplicate-notification prevention: Cloud Storage

🛠 Technology Stack

  • Cloud Functions (Gen2 / Python 3.10)
  • Cloud Pub/Sub (trigger)
  • Cloud Scheduler (periodic execution)
  • Cloud Storage (stores URLs that have already been notified)
  • Gmail SMTP (sends the notifications)
  • Others: the requests, beautifulsoup4, and google-cloud-storage libraries

Preparing GCP

First, create a project in GCP and link that project to a billing account.

Then, before anything else, enable the required APIs; a gcloud sketch follows the list below.

✅ GCP APIs that need to be enabled

  • Cloud Functions API: the function itself
  • Cloud Build API: builds run during function deployment
  • Cloud Scheduler API: scheduler for the periodic run
  • Pub/Sub API: trigger handling
  • Cloud Storage API: management of the duplicate-notification file
  • Eventarc API: Gen2 triggers
  • Cloud Run API: execution platform behind Cloud Functions (Gen2)
  • Artifact Registry API: storage of build artifacts
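
These can be enabled in one pass with gcloud. A minimal sketch, assuming the standard service identifiers for the APIs above; the project ID and billing account are placeholders (older gcloud versions may need "gcloud beta billing" instead of "gcloud billing"):

# Select the project and link billing (replace the placeholders with your own values)
gcloud config set project YOUR_PROJECT_ID
gcloud billing projects link YOUR_PROJECT_ID --billing-account=XXXXXX-XXXXXX-XXXXXX

# Enable the APIs used by this setup
gcloud services enable \
  cloudfunctions.googleapis.com \
  cloudbuild.googleapis.com \
  cloudscheduler.googleapis.com \
  pubsub.googleapis.com \
  storage.googleapis.com \
  eventarc.googleapis.com \
  run.googleapis.com \
  artifactregistry.googleapis.com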

📦 Script Files

Put the following three files in a local folder, then make the last one, deploy.sh, executable with chmod +x and run it (a run-through sketch follows deploy.sh below).

main.py (includes a commented-out test-notification mode)

import base64
import requests
from bs4 import BeautifulSoup
import smtplib
from email.mime.text import MIMEText
from google.cloud import storage
import os

# Set via environment variables
BUCKET_NAME = os.environ.get("BUCKET_NAME", "aws-cert-alert-storage-xxx")
STORED_FILE_NAME = "last_alerted_urls.txt"

# AWS Blog URL
BLOG_URL = "https://aws.amazon.com/blogs/security/"
KEYWORDS = ["ACM", "certificate", "certificates", "Starfield"]

# Email settings
SMTP_SERVER = "smtp.gmail.com"
SMTP_PORT = 587
EMAIL_SENDER = "xxxxx@gmail.com"
EMAIL_PASSWORD = "password"  # Gmail App Password (not your normal account password)
EMAIL_RECEIVERS = [
    "xxxxx@cisco.com"
]

# For testing: returns a fixed fake article instead of scraping the blog
#def fetch_matching_articles():
#    return [("【テスト通知】ACM に関する新しい記事が検出されました", "https://aws.amazon.com/blogs/security/fake-article2")]

# Scrape the blog index, fetch each linked article, and return (title, url) pairs matching a keyword
def fetch_matching_articles():
    res = requests.get(BLOG_URL)
    if res.status_code != 200:
        raise Exception("Failed to fetch AWS blog page.")

    soup = BeautifulSoup(res.text, 'html.parser')
    articles = soup.find_all("h2", class_="aws-blog-title")

    matched = []

    for article in articles:
        title = article.get_text(strip=True)
        a_tag = article.find("a")
        if not a_tag:
            continue

        url = a_tag["href"]
        article_res = requests.get(url)
        if article_res.status_code != 200:
            continue

        article_soup = BeautifulSoup(article_res.text, 'html.parser')
        full_text = article_soup.get_text().lower()

        if any(keyword.lower() in title.lower() or keyword.lower() in full_text for keyword in KEYWORDS):
            matched.append((title, url))

    return matched

# Load the set of already-notified URLs from the Cloud Storage bucket
def load_sent_urls():
    client = storage.Client()
    bucket = client.bucket(BUCKET_NAME)
    blob = bucket.blob(STORED_FILE_NAME)

    if not blob.exists():
        return set()

    content = blob.download_as_text()
    return set(content.strip().splitlines())

# Save the full set of notified URLs back to Cloud Storage
def save_sent_urls(urls):
    client = storage.Client()
    bucket = client.bucket(BUCKET_NAME)
    blob = bucket.blob(STORED_FILE_NAME)
    blob.upload_from_string("\n".join(urls))

# Send a single Gmail notification listing all newly matched articles
def send_email(matches):
    subject = "🔔 AWS証明書関連の新しいブログ記事があります"
    body = "以下の記事が見つかりました:\n\n"
    for title, url in matches:
        body += f"{title}\n{url}\n\n"

    msg = MIMEText(body)
    msg["Subject"] = subject
    msg["From"] = EMAIL_SENDER
    msg["To"] = ", ".join(EMAIL_RECEIVERS)

    with smtplib.SMTP(SMTP_SERVER, SMTP_PORT) as server:
        server.starttls()
        server.login(EMAIL_SENDER, EMAIL_PASSWORD)
        server.sendmail(EMAIL_SENDER, EMAIL_RECEIVERS, msg.as_string())

# Cloud Functions entry point for the Pub/Sub trigger (background-function signature)
def main(event, context):
    try:
        if 'data' in event:
            message_data = base64.b64decode(event['data']).decode('utf-8')
            print(f"Pub/Sub message: {message_data}")

        matches = fetch_matching_articles()
        if not matches:
            print("🔍 No matching articles found.")
            return

        sent_urls = load_sent_urls()
        new_matches = [(t, u) for (t, u) in matches if u not in sent_urls]

        if new_matches:
            send_email(new_matches)
            all_sent_urls = sent_urls.union({u for _, u in new_matches})
            save_sent_urls(all_sent_urls)
            print(f"✅ Sent {len(new_matches)} new article(s).")
        else:
            print("ℹ️ All matched articles were already notified.")

    except Exception as e:
        print(f"❌ Error: {str(e)}")

requirements.txt

requests
beautifulsoup4
google-cloud-storage

deploy.sh

#!/bin/bash
PROJECT_ID=$(gcloud config get-value project)
REGION="asia-northeast1"
FUNCTION_NAME="aws_cert_alert_function"
TOPIC_NAME="aws-cert-alert"
SCHEDULER_NAME="weekly-aws-cert-alert"
SCHEDULE="0 9 * * 1"
BUCKET_NAME="aws-cert-alert-storage-xxxxx"  # ご自身で作成したCloud Storageバケット名

echo "✅ Creating Pub/Sub topic"
gcloud pubsub topics describe $TOPIC_NAME > /dev/null 2>&1 || \
  gcloud pubsub topics create $TOPIC_NAME

echo "🚀 Deploying Cloud Function"
gcloud functions deploy $FUNCTION_NAME \
  --runtime python310 \
  --trigger-topic $TOPIC_NAME \
  --entry-point main \
  --source . \
  --region $REGION \
  --set-env-vars BUCKET_NAME=$BUCKET_NAME \
  --quiet

SERVICE_ACCOUNT=$(gcloud functions describe $FUNCTION_NAME --region=$REGION --format='value(serviceAccountEmail)')
if [ -n "$SERVICE_ACCOUNT" ]; then
  echo "🔐 Granting storage access"
  gcloud projects add-iam-policy-binding $PROJECT_ID \
    --member="serviceAccount:$SERVICE_ACCOUNT" \
    --role="roles/storage.objectAdmin" \
    --quiet
fi

echo "⏰ Creating/Updating Scheduler"
gcloud scheduler jobs describe $SCHEDULER_NAME --location=$REGION > /dev/null 2>&1 || \
  gcloud scheduler jobs create pubsub $SCHEDULER_NAME \
    --schedule "$SCHEDULE" \
    --time-zone "Asia/Tokyo" \
    --topic $TOPIC_NAME \
    --message-body "trigger-check" \
    --location $REGION
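
Before running the script, the Cloud Storage bucket referenced by BUCKET_NAME has to exist. A hedged run-through, with the bucket name and region as placeholders that must match deploy.sh:

# Create the dedup bucket (name must match BUCKET_NAME in deploy.sh)
gsutil mb -l asia-northeast1 gs://aws-cert-alert-storage-xxxxx

# Make the deploy script executable and run it
chmod +x deploy.sh
./deploy.sh

# Confirm the deployed resources
gcloud functions describe aws_cert_alert_function --region=asia-northeast1
gcloud scheduler jobs list --location=asia-northeast1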

🧪 Test Run & How to Check Logs

Manual test via Pub/Sub

gcloud pubsub topics publish aws-cert-alert --message "test-notification"
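
The Cloud Scheduler job created by deploy.sh can also be fired on demand instead of publishing to the topic directly (job name and region are the ones from deploy.sh):

gcloud scheduler jobs run weekly-aws-cert-alert --location=asia-northeast1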

Checking logs (CLI)

gcloud functions logs read aws_cert_alert_function \
  --region=asia-northeast1 \
  --limit=20 \
  --gen2
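
To confirm that duplicate prevention is working, the stored URL list can be read straight from the bucket (replace the bucket name with your own; on newer gcloud versions, gcloud storage cat works as well):

gsutil cat gs://aws-cert-alert-storage-xxxxx/last_alerted_urls.txt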

✅ Summary
With the setup in this article, you get an automated check bot that:

  • periodically crawls the security blog,
  • checks articles for keyword matches,
  • notifies you immediately via Gmail, and
  • prevents duplicate notifications. ✅
