
[memo] ML: Linear Regression (Formulas)

Simple Regression

  • Linear model
\hat{y} = w_1 x + w_0
  • Residual
\hat{y}_i - y_i
  • Cost function (squared error)
J(w_0, w_1) = \frac{1}{2m}\sum_{i=1}^{m}(\hat{y}_i - y_i)^2\\
  • Partial derivatives give the slope of the cost surface (see the derivation below)
  • Gradient descent
    • An optimization algorithm
      • Finds the $w_0$, $w_1$ that minimize the cost
      • Gradient descent is the standard choice
        • Optimizes $w_0$, $w_1$
        • Minimizes $J$
w_0 := w_0 - \alpha \frac{\partial}{\partial w_0}J(w_0, w_1) = w_0 - \alpha\frac{1}{m}\sum_{i=1}^{m}(\hat{y}_i - y_i)\\
w_1 := w_1 - \alpha \frac{\partial}{\partial w_1}J(w_0, w_1) = w_1 - \alpha\frac{1}{m}\sum_{i=1}^{m}(\hat{y}_i - y_i)x_i\\
Here := means the value is updated (assignment), and \alpha is the learning rate.
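
Where these gradients come from: differentiating $J$ with the chain rule (the $\frac{1}{2m}$ cancels the 2 from the square), with $\hat{y}_i = w_1 x_i + w_0$:

\frac{\partial}{\partial w_0}J(w_0, w_1) = \frac{1}{m}\sum_{i=1}^{m}(\hat{y}_i - y_i)\\
\frac{\partial}{\partial w_1}J(w_0, w_1) = \frac{1}{m}\sum_{i=1}^{m}(\hat{y}_i - y_i)x_i\\

A minimal sketch of these update rules in Python (NumPy and the toy data are my own illustration, not from any particular source):

```python
import numpy as np

def gradient_descent_simple(x, y, alpha=0.02, n_iters=2000):
    """Fit y ≈ w1*x + w0 by gradient descent on the squared-error cost."""
    m = len(x)
    w0, w1 = 0.0, 0.0
    for _ in range(n_iters):
        y_hat = w1 * x + w0                    # predictions
        grad_w0 = (y_hat - y).sum() / m        # ∂J/∂w0
        grad_w1 = ((y_hat - y) * x).sum() / m  # ∂J/∂w1
        w0 -= alpha * grad_w0                  # simultaneous update of both weights
        w1 -= alpha * grad_w1
    return w0, w1

# Toy data generated from y = 2x + 1 plus noise
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 100)
y = 2 * x + 1 + rng.normal(0, 0.5, 100)
print(gradient_descent_simple(x, y))  # ≈ (1.0, 2.0)
```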

Multiple Regression

  • Linear model
\hat{y} = XW
  • Cost function
J(W) = \frac{1}{2m}\sum_{i=1}^{m}(\hat{y}_i - y_i)^2\\
= \frac{1}{2m}(XW - y)^T(XW - y)\\
  • Gradient descent (a vectorized sketch follows the normalization formulas below)
W := W - \alpha\frac{1}{m}X^T(XW - y)\\
  • Feature scaling / normalization

1) z-score normalization (standardization)

x_1 = \frac{x_1-\bar{x}}{\sigma}

2) min-max normalization

x_1 = \frac{x_1-x_{min}}{x_{max}-x_{min}}
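
Putting the multiple regression pieces together: a minimal sketch, assuming NumPy, with illustrative data of my own. Features are z-score normalized, a column of ones is prepended to $X$ so the first weight acts as the intercept, and the vectorized update $W := W - \alpha\frac{1}{m}X^T(XW - y)$ is applied:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 200, 2
X_raw = rng.uniform(0, 100, size=(m, n))  # raw features on a large scale
y = 3.0 + X_raw @ np.array([0.5, -2.0]) + rng.normal(0, 0.1, m)

# z-score normalization, one mean/std per feature column
mu, sigma = X_raw.mean(axis=0), X_raw.std(axis=0)
X_norm = (X_raw - mu) / sigma

# prepend a column of ones so the first weight acts as the intercept w0
X = np.hstack([np.ones((m, 1)), X_norm])

# vectorized gradient descent: W := W - alpha * (1/m) * X^T (XW - y)
alpha, n_iters = 0.1, 1000
W = np.zeros(n + 1)
for _ in range(n_iters):
    W -= alpha / m * X.T @ (X @ W - y)

print(W)  # learned weights (in the normalized feature space)
```

Normalizing first puts the features on comparable scales, so the cost surface is well conditioned and a single learning rate works for every weight.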

Question:
If the learning rate is too high, does each step overshoot the minimum so that the iterates drift further and further away? (See the sketch below.)
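
As far as I understand, yes: each step moves by $-\alpha$ times the gradient, so if $\alpha$ is too large the step overshoots the minimum, lands where the gradient is even steeper, and the iterates oscillate with growing amplitude instead of converging. A one-parameter sketch (toy cost $J(w) = w^2$, values purely illustrative):

```python
# J(w) = w^2 has gradient 2w and its minimum at w = 0.
# The update w := w - alpha * 2w scales w by (1 - 2*alpha) each step,
# so it diverges as soon as |1 - 2*alpha| > 1, i.e. alpha > 1.
for alpha in (0.1, 1.1):
    w = 1.0
    for step in range(5):
        w -= alpha * 2 * w
        print(f"alpha={alpha}, step {step + 1}: w = {w:+.3f}")
# alpha=0.1 shrinks w toward 0; alpha=1.1 oscillates and grows without bound.
```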

ref:
- Qiita math formula cheat sheet (Qiitaの数式チートシート)
- Introduction to linear regression (線形回帰 入門)
- Introduction to residuals and least-squares regression
