PRML Exercise 3.4 Solution

Posted at 2020-06-06

Problem

Consider the linear model
$$
\begin{align*}
y(\mathbf{x}, \mathbf{w})=w_{0}+\sum_{i=1}^{D} w_{i} x_{i}
\tag{3.105}
\end{align*}
$$
together with the sum-of-squares error function

\begin{align}
E_{D}(\mathbf{w})=\frac{1}{2} \sum_{n=1}^{N}\left\{y\left(\mathbf{x}_{n}, \mathbf{w}\right)-t_{n}\right\}^{2}.
\tag{3.106}
\end{align}

Suppose that Gaussian noise $\epsilon_{i}$ with zero mean and variance $\sigma^{2}$ is added independently to each of the input variables $x_{i}$. Using the two properties $\mathbb{E}[\epsilon_{i}] = 0$ and $\mathbb{E}[\epsilon_{i}\epsilon_{j}] = \delta_{ij}\sigma^{2}$, show that minimizing $E_{D}$ averaged over the noise distribution is equivalent to minimizing the sum-of-squares error for noise-free input variables with the addition of a weight-decay regularization term, in which the bias parameter $w_{0}$ is omitted from the regularizer.

Solution

Write the noise-corrupted inputs as

\begin{align}
\tilde{x}_{n i}=x_{n i}+\epsilon_{n i}.
\end{align}

The corresponding model output is then

\begin{align}
\tilde{y}_{n}=y\left(\tilde{\mathbf{x}}_{n}, \mathbf{w}\right)=w_{0}+\sum_{i=1}^{D} w_{i} \tilde{x}_{n i}=w_{0}+\sum_{i=1}^{D} w_{i}\left(x_{n i}+\epsilon_{n i}\right)=y_{n}+\sum_{i=1}^{D} w_{i} \epsilon_{n i}.
\end{align}
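This identity can be checked numerically for a single data point. The sketch below uses hypothetical values (the dimension `D`, the noise scale `0.1`, and all sampled numbers are illustrative assumptions, not taken from the text):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical values for one data point; all numbers here are illustrative.
D = 4
x = rng.normal(size=D)            # noise-free input variables x_{ni}
eps = 0.1 * rng.normal(size=D)    # Gaussian noise added to each input
w0, w = 0.5, rng.normal(size=D)   # bias w_0 and weights w_i

y = w0 + w @ x                    # noise-free output y_n
y_tilde = w0 + w @ (x + eps)      # output computed from the noisy inputs

# The identity above: the noisy output is y_n plus the linear noise term.
assert np.isclose(y_tilde, y + w @ eps)
```

The linearity of the model is what makes the noise enter the output purely additively; this is the fact the rest of the derivation relies on.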
Let $\tilde{E}$ denote the sum-of-squares error under the noisy inputs. Then

\begin{align}
\tilde{E} &=\frac{1}{2} \sum_{n=1}^{N}\left(\tilde{y}_{n}-t_{n}\right)^{2}=\frac{1}{2} \sum_{n=1}^{N}\left(\tilde{y}_{n}^{2}-2 \tilde{y}_{n} t_{n}+t_{n}^{2}\right) \\
&=\frac{1}{2} \sum_{n=1}^{N}\left(y_{n}^{2}+2 y_{n} \sum_{i=1}^{D} w_{i} \epsilon_{n i}+\left(\sum_{i=1}^{D} w_{i} \epsilon_{n i}\right)^{2}-2 y_{n} t_{n}-2\left(\sum_{i=1}^{D} w_{i} \epsilon_{n i}\right) t_{n}+t_{n}^{2}\right).
\end{align}

Since $\mathbb{E}[\epsilon_{i}] = 0$ by assumption, the second and fifth terms above vanish when the expectation over the noise is taken. For the third term, taking the expectation over the noise gives

\begin{align*}
\mathbb{E}_{\epsilon}\left[\frac{1}{2} \sum_{n=1}^{N}\left(\sum_{i=1}^{D} w_{i} \epsilon_{n i}\right)^{2}\right]
& = \frac{1}{2} \sum_{n=1}^{N} \mathbb{E}_{\epsilon}\left[\left(\sum_{i=1}^{D} w_{i} \epsilon_{n i}\right)^{2}\right] \\
& = \frac{1}{2} \sum_{n=1}^{N} \mathbb{E}_{\epsilon}\left[\left(w_{1} \epsilon_{n 1}+\cdots+w_{D} \epsilon_{n D}\right)^{2}\right] \\
& = \frac{1}{2} \sum_{n=1}^{N} \mathbb{E}_{\epsilon}\left[w_{1}^{2} \epsilon_{n 1}^{2}+\cdots+w_{D}^{2} \epsilon_{n D}^{2}+2\left(w_{1} w_{2} \epsilon_{n 1} \epsilon_{n 2}+\cdots+w_{D-1} w_{D} \epsilon_{n, D-1} \epsilon_{n D}\right)\right] \\
& = \frac{1}{2} \sum_{n=1}^{N}\left(\mathbb{E}_{\epsilon}\left[\sum_{i=1}^{D} w_{i}^{2} \epsilon_{n i}^{2}\right]+2\, \mathbb{E}_{\epsilon}\left[w_{1} w_{2} \epsilon_{n 1} \epsilon_{n 2}+\cdots+w_{D-1} w_{D} \epsilon_{n, D-1} \epsilon_{n D}\right]\right) \\
& = \frac{1}{2} \sum_{n=1}^{N} \sum_{i=1}^{D} w_{i}^{2} \sigma^{2}
= \frac{N \sigma^{2}}{2} \sum_{i=1}^{D} w_{i}^{2},
\end{align*}

where the cross terms vanish because $\mathbb{E}[\epsilon_{i}\epsilon_{j}] = \delta_{ij}\sigma^{2} = 0$ for $i \neq j$, while each squared term contributes $\mathbb{E}[\epsilon_{i}^{2}] = \sigma^{2}$.

Therefore,

\begin{align*}
\mathbb{E}_{\epsilon}[\tilde{E}]
& =\mathbb{E}_{\epsilon}\left[\frac{1}{2} \sum_{n=1}^{N}\left(y_{n}^{2}-2 y_{n} t_{n}+t_{n}^{2}\right)\right]+\frac{N \sigma^{2}}{2} \sum_{i=1}^{D} w_{i}^{2} \\
& =\frac{1}{2} \sum_{n=1}^{N}\left(y_{n}-t_{n}\right)^{2}+\frac{N \sigma^{2}}{2} \sum_{i=1}^{D} w_{i}^{2} \\
& =E_{D}(\mathbf{w})+\frac{N \sigma^{2}}{2} \mathbf{w}^{\top} \mathbf{w},
\end{align*}

where $\mathbf{w}=(w_{1},\ldots,w_{D})^{\top}$ excludes the bias parameter $w_{0}$.

Minimizing the noise-averaged error $\mathbb{E}_{\epsilon}[\tilde{E}]$ is therefore equivalent to minimizing the noise-free sum-of-squares error $E_{D}(\mathbf{w})$ plus a weight-decay regularization term that does not involve $w_{0}$, which establishes the required result.
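The final identity can also be verified by Monte Carlo simulation: averaging the noisy error over many noise draws should reproduce the closed form $E_{D}(\mathbf{w})+\frac{N\sigma^{2}}{2}\mathbf{w}^{\top}\mathbf{w}$. The sketch below uses hypothetical problem sizes and randomly generated data; all specific numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy problem sizes; all values here are illustrative assumptions.
N, D, sigma = 50, 3, 0.5
X = rng.normal(size=(N, D))        # noise-free inputs x_{ni}
w0, w = 0.3, rng.normal(size=D)    # bias and weights of model (3.105)
t = rng.normal(size=N)             # targets t_n

def sse(X_in):
    """Sum-of-squares error (3.106) for the linear model (3.105)."""
    y = w0 + X_in @ w
    return 0.5 * np.sum((y - t) ** 2)

# Monte Carlo estimate of E_eps[E~]: average the error over many noise draws.
S = 5000
noisy = np.mean([sse(X + sigma * rng.normal(size=(N, D))) for _ in range(S)])

# Closed form derived above: E_D(w) + (N sigma^2 / 2) w^T w, bias excluded.
closed = sse(X) + 0.5 * N * sigma**2 * (w @ w)
print(noisy, closed)  # the two numbers should agree to within Monte Carlo error
```

Note that the regularization coefficient grows with both the noise variance $\sigma^{2}$ and the number of data points $N$, exactly as the derivation predicts.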
