Notes on 『ゼロから作るDeep Learning』 (Deep Learning from Scratch), Chapter 4

斎藤康毅: 『ゼロから作るDeep Learning』, O'Reilly Japan, 2016.
These are my notes, written to deepen my own understanding.

Chapter 4: Training Neural Networks

Mathematical concepts introduced

  • Loss function
  • Sum of squared errors (the book glosses this as "mean squared error", but note that equation (4.1) below is a sum, not a mean)
  • Cross entropy error
  • Numerical differentiation
  • Partial derivative: the derivative of a function of several variables, taken with respect to one variable at a time
  • Gradient: the vector collecting the partial derivatives with respect to all variables, e.g. $ \left(\dfrac {\partial f} {\partial x_{0}},\dfrac {\partial f} {\partial x_{1}}\right) $
  • Gradient descent (the book calls it the "gradient method")
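The last few concepts fit together naturally: a numerical gradient built from centered differences, plugged into a few steps of gradient descent. The sketch below is my own minimal illustration, not the book's code; the test function `f` and the learning rate are chosen here just for the example.

```python
import numpy as np

def numerical_gradient(f, x):
    # Centered difference for each variable: (f(x+h) - f(x-h)) / (2h)
    h = 1e-4
    grad = np.zeros_like(x)
    for i in range(x.size):
        tmp = x[i]
        x[i] = tmp + h
        fxh1 = f(x)
        x[i] = tmp - h
        fxh2 = f(x)
        grad[i] = (fxh1 - fxh2) / (2 * h)
        x[i] = tmp  # restore the original value
    return grad

def gradient_descent(f, x0, lr=0.1, steps=100):
    # Repeatedly step against the gradient
    x = x0.astype(float)
    for _ in range(steps):
        x -= lr * numerical_gradient(f, x)
    return x

# f(x0, x1) = x0^2 + x1^2 has its minimum at the origin
f = lambda x: np.sum(x ** 2)
x_min = gradient_descent(f, np.array([3.0, -4.0]))  # converges toward [0, 0]
```

For this quadratic, each step multiplies `x` by `(1 - 2*lr)`, so 100 steps with `lr=0.1` shrink the starting point essentially to zero.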

4.2.1

Sum of squared errors ("mean squared error" in the book)

$E=\dfrac {1} {2}\displaystyle \sum_{k}\left( y_{k}-t_{k}\right) ^{2}\tag{4.1}$

import numpy as np

def mean_squared_error(y, t):
    return 0.5 * np.sum((y-t)**2)

# Teacher data: the correct label is "2" (one-hot)
t = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
# Example 1: network output where "2" has the highest probability
y = [0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]

mean_squared_error(np.array(y), np.array(t))

Execution result
0.097500000000000031

Let's verify this by hand to deepen understanding.

\begin{eqnarray}
E 
&=&\dfrac {1} {2}\displaystyle \sum_{k=1}^{10}\left( y_{k}-t_{k}\right) ^{2}\\
&=&\dfrac {1} {2} \left( \left( y_{1}-t_{1}\right) ^{2}+\left( y_{2}-t_{2}\right) ^{2}+\left( y_{3}-t_{3}\right) ^{2}+\cdots + \left( y_{10}-t_{10}\right) ^{2}\right)\\
&=& \dfrac {1} {2} \left( \left( 0.1-0\right) ^{2}+\left(0.05-0\right) ^{2}+\left(0.6-1\right) ^{2}+\cdots + \left( 0.0-0\right) ^{2}\right)\\
&=& \dfrac {1} {2} \left( \left( 0.1\right) ^{2}+\left(0.05\right) ^{2}+\left(-0.4\right) ^{2}+\cdots + \left( 0.0\right) ^{2}\right)\\
&=& \dfrac {1} {2} \left( \left( 0.1\right) ^{2}+\left(0.05\right) ^{2}+\left(-0.4\right) ^{2}+\left(0.05\right) ^{2}+\left(0.1\right) ^{2}+ \left( 0.1\right) ^{2}+\left( 0.0\right) ^{2}\right) \quad \text{(zero terms omitted)}\\
&=& \dfrac {1} {2} \left( \left( 0.01\right)+\left(0.0025\right)+\left(0.16\right)+\left(0.0025\right)+\left( 0.01\right)+\left( 0.01\right)+ \left( 0.0\right)\right)\\
&=& \dfrac {1} {2}\cdot 0.195\\
&=& 0.0975
\end{eqnarray}
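As a sanity check on why this loss is useful, we can run the same function on an output that puts the highest probability on the wrong class; the error grows sharply. The values below are chosen in the spirit of the book's second example (where "7" scores highest), to contrast with the 0.0975 above.

```python
import numpy as np

def mean_squared_error(y, t):
    return 0.5 * np.sum((y - t) ** 2)

# Teacher data: the correct label is still "2" (one-hot)
t = [0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
# Example 2: the network now gives the highest probability to "7"
y2 = [0.1, 0.05, 0.1, 0.0, 0.05, 0.1, 0.0, 0.6, 0.0, 0.0]

mean_squared_error(np.array(y2), np.array(t))  # 0.5975, much larger than 0.0975
```

So the loss correctly ranks the confident, correct output of Example 1 as better than this misprediction.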

If you spot any mistakes, I would be glad to hear about them.

I may update this from time to time :snowflake:
