
The unbiased variance of the dependent variable in multiple regression (reference: The Elements of Statistical Learning)

1. Summary

This article proves that $$\widehat{\sigma}^2=\frac{1}{N-p-1}\sum_{i=1}^N(y_i-\widehat{y}_i)^2\tag{1}$$ is an unbiased estimator of the error variance, i.e. that $$E[\widehat{\sigma}^2]=\sigma^2.$$

2. Proof

Though I glossed over it above, $\widehat{y}_i$ is the $i$-th element of $$\widehat{y}=X\widehat{\beta}=X(X^TX)^{-1}X^Ty,$$ and $$H:=X(X^TX)^{-1}X^T$$ is commonly called the projection matrix (hat matrix). The model here is $y=X\beta+e$ with $E[e]=0_N$ and $Var[e]=\sigma^2I_N$, where $X$ is $N\times(p+1)$, including the intercept column.
Now for the proof (which is really just algebraic manipulation). The sum in (1), without the leading coefficient, can be rewritten as follows.

\begin{eqnarray}
\sum_{i=1}^N(y_i-\widehat{y}_i)^2&=&(y-\widehat{y})^T(y-\widehat{y})\\
&=&(y-Hy)^T(y-Hy)\\
&=&[(I-H)y]^T[(I-H)y]\\
&=&y^T(I-H)^T(I-H)y\\
&=&y^T(I-H^T-H+H^TH)y\\
&=&y^T(I-2H+H)y\\
&&(\because H^T=(X(X^TX)^{-1}X^T)^T=X(X^TX)^{-1}X^T=H,\ H^TH=H)\\
&=&y^T(I-H)y\\
&=&(X\beta+e)^T(I-H)(X\beta+e)\\
&=&\beta^TX^TX\beta+\beta^TX^Te-\beta^TX^THX\beta-\beta^TX^THe\\
&&+e^TX\beta+e^Te-e^THX\beta-e^THe\\
&=&\beta^TX^Te-\beta^TX^THe+e^TX\beta+e^Te-e^THX\beta-e^THe(:=A)\\
&&(\because \beta^TX^THX\beta=\beta^TX^TX(X^TX)^{-1}X^TX\beta=\beta^TX^TX\beta)
\end{eqnarray}
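As a numerical sanity check of the two hat-matrix properties used above, symmetry ($H^T=H$) and idempotency ($H^TH=H$), here is a short sketch with NumPy. The dimensions and the random design matrix are arbitrary choices for illustration, not from the article.

```python
import numpy as np

# Illustrative check of the hat matrix H = X(X^T X)^{-1} X^T.
# N, p, and the random X are arbitrary; any full-rank design works.
rng = np.random.default_rng(0)
N, p = 50, 3
X = np.column_stack([np.ones(N), rng.normal(size=(N, p))])  # intercept + p predictors

H = X @ np.linalg.inv(X.T @ X) @ X.T

print(np.allclose(H, H.T))   # symmetry: H^T = H
print(np.allclose(H @ H, H)) # idempotency: H^T H = H^2 = H
```

Both checks print `True` for any design matrix $X$ of full column rank, which is exactly what the algebraic step $(I-H)^T(I-H)=I-H$ relies on.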

Now take the expectation of $A$. Using $E[e]=0_N$, the four terms that are linear in $e$ vanish, leaving

\begin{eqnarray}
E[A] &=& E[e^Te-e^THe]\\
&=&E[e^Te]-E[e^THe]\\
&=&E[tr(e^Te)]-E[tr(e^THe)]\\
&&(\because e^Te\ \mathrm{and}\ e^THe\ \mathrm{are\ scalars})\\
&=&E[tr(ee^T)]-E[tr(Hee^T)]\\
&&(\because tr(AB)=tr(BA))\\
&=&tr(E[ee^T])-tr(HE[ee^T])\\
&=&tr(\sigma^2I_N)-\sigma^2tr(H)\\
&&(\because E[ee^T]=E[(e-0_N)(e-0_N)^T]=Var[e]=\sigma^2I_N)\\
&=&N\sigma^2-\sigma^2tr(X(X^TX)^{-1}X^T)\\
&=&N\sigma^2-\sigma^2tr((X^TX)^{-1}X^TX)\\
&=&N\sigma^2-\sigma^2tr(I_{p+1})\\
&=&(N-(p+1))\sigma^2
\end{eqnarray}

Therefore $A$ divided by $N-p-1$, that is, $$\widehat{\sigma}^2=\frac{1}{N-p-1}\sum_{i=1}^N(y_i-\widehat{y}_i)^2,$$ is an unbiased estimator of the variance. The key point is that we divide not by the number of observations minus 1, but by the number of observations minus (the number of predictors plus one for the intercept).
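The unbiasedness can also be checked by Monte Carlo simulation. The sketch below repeatedly fits OLS to data generated from a fixed design and averages the estimator (1); the particular $N$, $p$, $\beta$, and $\sigma^2$ are arbitrary values chosen for illustration.

```python
import numpy as np

# Monte Carlo check that dividing the residual sum of squares by
# N - p - 1 gives an unbiased estimator of sigma^2.
rng = np.random.default_rng(42)
N, p, sigma2 = 30, 2, 4.0
X = np.column_stack([np.ones(N), rng.normal(size=(N, p))])  # fixed design
beta = np.array([1.0, 2.0, -0.5])

trials = 20000
est = np.empty(trials)
for t in range(trials):
    e = rng.normal(scale=np.sqrt(sigma2), size=N)
    y = X @ beta + e
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]  # OLS fit
    resid = y - X @ beta_hat
    est[t] = resid @ resid / (N - p - 1)             # estimator (1)

print(est.mean())  # should be close to sigma2 = 4.0
```

Averaging over many trials, the estimate concentrates near the true $\sigma^2$, whereas dividing by $N-1$ instead would systematically underestimate it.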

Bonus

Just as in a previous article (the proof of Theorem 6.3 in Inagaki's "Mathematical Statistics"), the scaled estimator $(N-p-1)\widehat{\sigma}^2$ satisfies $$(N-p-1)\widehat{\sigma}^2\sim\sigma^2\chi_{N-p-1}^2.$$
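This distributional claim can likewise be sanity-checked by simulation: a $\chi^2_{N-p-1}$ variable has mean $N-p-1$ and variance $2(N-p-1)$, so the simulated values of $(N-p-1)\widehat{\sigma}^2/\sigma^2$ should match those moments. The parameter values below are arbitrary illustrations.

```python
import numpy as np

# Check the moments of (N-p-1) * sigma_hat^2 / sigma^2 against
# chi-square with df = N - p - 1 (mean = df, variance = 2*df).
rng = np.random.default_rng(1)
N, p, sigma2 = 20, 4, 2.0
df = N - p - 1  # = 15
X = np.column_stack([np.ones(N), rng.normal(size=(N, p))])
beta = rng.normal(size=p + 1)

trials = 50000
stat = np.empty(trials)
for t in range(trials):
    e = rng.normal(scale=np.sqrt(sigma2), size=N)
    y = X @ beta + e
    resid = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
    stat[t] = resid @ resid / sigma2  # = (N-p-1) * sigma_hat^2 / sigma^2

print(stat.mean(), stat.var())  # roughly df = 15 and 2*df = 30
```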

References

Hastie, Trevor; Tibshirani, Robert; Friedman, Jerome (2009). The Elements of Statistical Learning. This article fills in the gaps in the derivation around p. 47.
