
Maximum Likelihood Estimation


【Key Points】

  • Observe a Bernoulli trial ("10 points = success") n times; with k successes, the maximum likelihood estimator is θ̂ = k/n.
  • Just differentiate the log-likelihood and set it to 0. Boundary cases: k=0 → θ̂=0; k=n → θ̂=1.

【Derivation (single sample: binomial distribution)】

  • Likelihood: L(θ) = C(n,k) θ^k (1-θ)^{n-k}
  • Log-likelihood: ℓ(θ) = log L = const + k log θ + (n-k) log(1-θ)
  • First derivative: dℓ/dθ = k/θ - (n-k)/(1-θ)
  • ML condition: dℓ/dθ = 0 ⇒ k(1-θ) = (n-k)θ ⇒ θ̂ = k/n
  • Second derivative: d²ℓ/dθ² = -k/θ² - (n-k)/(1-θ)² < 0 (maximum; verified symbolically in the sketch below)
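
The algebra above is easy to verify symbolically. A minimal sketch using sympy (sympy is my choice here, not the article's; any CAS works):

import sympy as sp

theta, n, k = sp.symbols('theta n k', positive=True)

# Log-likelihood up to the additive constant log C(n,k)
ell = k * sp.log(theta) + (n - k) * sp.log(1 - theta)

# Stationary point of ℓ(θ): solve dℓ/dθ = 0
print(sp.solve(sp.diff(ell, theta), theta))  # [k/n]

# Second derivative at θ = k/n simplifies to -n³/(k(n-k)) < 0 for 0 < k < n,
# so the stationary point is a maximum
print(sp.simplify(sp.diff(ell, theta, 2).subs(theta, k / n)))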

【Multiple data sets (M independent sets with nᵢ trials and kᵢ successes)】

  • Total likelihood: L(θ) = ∏ C(nᵢ,kᵢ) θ^{kᵢ} (1-θ)^{nᵢ-kᵢ}
  • Log-likelihood: ℓ(θ) = const + (∑kᵢ) log θ + (∑(nᵢ-kᵢ)) log(1-θ)
  • Derivative = 0 ⇒ θ̂ = (∑kᵢ)/(∑nᵢ) (the overall success rate; a numerical check follows below)
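
As a quick numerical check (a sketch; the values reuse the two examples later in this article), maximizing the pooled log-likelihood reproduces (∑kᵢ)/(∑nᵢ):

import numpy as np
from scipy.optimize import minimize_scalar

ns = np.array([300, 600])  # trials per data set
ks = np.array([20, 48])    # successes per data set

def neg_pooled_loglik(theta):
    return -np.sum(ks * np.log(theta) + (ns - ks) * np.log(1 - theta))

res = minimize_scalar(neg_pooled_loglik, bounds=(1e-6, 1 - 1e-6), method='bounded')
print(res.x, ks.sum() / ns.sum())  # both ≈ 0.07556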

【Variance and unbiasedness (large-sample approximation)】

  • Fisher information: I(θ) = n/[θ(1-θ)]
  • Approximate variance: Var(θ̂) ≈ θ(1-θ)/n; 95% CI ≈ θ̂ ± 1.96√(θ̂(1-θ̂)/n) (checked by simulation below)
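
A minimal Monte Carlo sketch (the values θ = 1/15 and n = 300 are assumed to match example 1 below) confirming that the spread of θ̂ = k/n matches θ(1-θ)/n:

import numpy as np

rng = np.random.default_rng(0)
theta_true, n, reps = 1 / 15, 300, 100_000

# Simulate k ~ Binomial(n, θ) many times and form θ̂ = k/n each time
theta_hats = rng.binomial(n, theta_true, size=reps) / n

print(theta_hats.var())                   # empirical variance of θ̂
print(theta_true * (1 - theta_true) / n)  # theoretical θ(1-θ)/n ≈ 2.07e-4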

【Examples from the text】

  1. 300 shots with 20 ten-point hits: n=300, k=20
     ⇒ θ̂₁ = 20/300 = 1/15 ≈ 0.0667

  2. 600 shots with 48 ten-point hits: n=600, k=48
     ⇒ θ̂₂ = 48/600 = 0.08 = 2/25

(If you want a single θ estimated from both together:)
 θ̂_combined = (20+48)/(300+600) = 68/900 = 0.0756…

【Multi-parameter version (the textbook's "simultaneous equations")】

  • For a parameter vector θ = (θ₁,…,θ_m):
     solve ∂ℓ/∂θⱼ = 0 (j = 1…m) simultaneously; confirm a maximum by checking that the Hessian is negative definite (see the numerical sketch below).
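
A minimal numerical sketch of the multi-parameter case (the setup is an assumption: two groups with independent success probabilities θ₁, θ₂, reusing this article's example data). Because the joint log-likelihood separates, the simultaneous equations give θ̂ⱼ = kⱼ/nⱼ:

import numpy as np
from scipy.optimize import minimize

ns = np.array([300, 600])
ks = np.array([20, 48])

def neg_loglik(theta):  # theta = (θ₁, θ₂)
    return -np.sum(ks * np.log(theta) + (ns - ks) * np.log(1 - theta))

res = minimize(neg_loglik, x0=[0.5, 0.5], bounds=[(1e-6, 1 - 1e-6)] * 2)
print(res.x)  # ≈ [0.0667, 0.08], i.e. kⱼ/nⱼ

# The Hessian of ℓ is diagonal here, with entries -kⱼ/θⱼ² - (nⱼ-kⱼ)/(1-θⱼ)² < 0,
# hence negative definite at θ̂: the stationary point is a maximum.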

【Computation template (ready to use)】

  • logL = k*log(theta) + (n-k)*log(1-theta)
  • Set the derivative to 0 → k/theta - (n-k)/(1-theta) = 0 → theta_hat = k/n

✅ Python script: maximum likelihood estimation (MLE) for the Bernoulli distribution

# Program Name: bernoulli_mle_estimator.py
# Overview: Derive and visualize MLE for Bernoulli/binomial success probability θ
# Usage: Plug in values of n (trials) and k (successes)

import numpy as np
import matplotlib.pyplot as plt
from scipy.optimize import minimize_scalar

# -----------------------
# Data input
# -----------------------
n = 300            # number of trials
k = 20             # number of successes
theta_hat = k / n  # closed-form MLE

# -----------------------
# Log-likelihood function
# -----------------------
def log_likelihood(theta):
    if theta <= 0 or theta >= 1:
        return -np.inf
    return k * np.log(theta) + (n - k) * np.log(1 - theta)

# -----------------------
# Numerical maximization (cross-check of the closed form)
# -----------------------
res = minimize_scalar(lambda t: -log_likelihood(t), bounds=(0.0001, 0.9999), method='bounded')
theta_mle = res.x  # agrees with theta_hat = k/n up to solver tolerance

# -----------------------
# Approximate variance and 95% confidence interval
# -----------------------
var_theta = theta_hat * (1 - theta_hat) / n
std_error = np.sqrt(var_theta)
ci_95 = (theta_hat - 1.96 * std_error, theta_hat + 1.96 * std_error)

# -----------------------
# Plot
# -----------------------
theta_range = np.linspace(0.0001, 0.9999, 500)
logL_vals = [log_likelihood(t) for t in theta_range]

plt.plot(theta_range, logL_vals, label='log-likelihood ℓ(θ)')
plt.axvline(theta_hat, color='r', linestyle='--', label=f'θ̂ = {theta_hat:.4f}')
plt.axvspan(*ci_95, alpha=0.2, color='red', label='95% CI')

plt.title(f'Bernoulli MLE: n={n}, k={k}')
plt.xlabel('θ')
plt.ylabel('ℓ(θ)')
plt.legend()
plt.grid(True)
plt.show()

# -----------------------
# Print results
# -----------------------
print(f"✅ Maximum likelihood estimate: θ̂ = {theta_hat:.5f}")
print(f"🔁 Approximate standard error: SE = {std_error:.5f}")
print(f"📊 95% confidence interval: {ci_95[0]:.5f} ~ {ci_95[1]:.5f}")

✅ Example run: n = 300, k = 20

✅ Maximum likelihood estimate: θ̂ = 0.06667
🔁 Approximate standard error: SE = 0.01440
📊 95% confidence interval: 0.03844 ~ 0.09489

🔄 Extended example (pooling multiple data sets)

To pool several observations (nᵢ, kᵢ) into one estimate:

n_total = sum([300, 600])
k_total = sum([20, 48])
theta_hat_combined = k_total / n_total
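
Running this gives theta_hat_combined = 68/900 ≈ 0.07556, matching the pooled estimate θ̂ = (∑kᵢ)/(∑nᵢ) derived above.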
