
A quick memo of the residual sum of squares (error) expressions used in regression models.

Residual sum of squares for linear regression

$$\sum_{i=1}^{n} (y_i - \hat{y}_i)^2$$
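As a minimal sketch of this expression in Python (NumPy), where `y_true` and `y_pred` are placeholder names for the observed and predicted values:

```python
import numpy as np

def rss(y_true, y_pred):
    """Residual sum of squares: sum over i of (y_i - y_hat_i)^2."""
    residuals = np.asarray(y_true) - np.asarray(y_pred)
    return np.sum(residuals ** 2)

# tiny usage example with made-up numbers
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5, 0.0, 2.0, 8.0])
print(rss(y_true, y_pred))  # 1.5
```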

Residual sum of squares for lasso regression

$$\sum_{i=1}^{n} (y_i - \hat{y}_i)^2 + \alpha \sum_{k=1}^{d} |\omega_k|$$

Here $\alpha$ is the regularization strength and $\sum_{k=1}^{d} |\omega_k|$ is the sum of the absolute values of all parameters except the intercept.
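A minimal sketch of this penalized objective, assuming `y_true`, `y_pred`, a coefficient vector `w` that excludes the intercept, and a regularization strength `alpha` (all names are placeholders for illustration):

```python
import numpy as np

def lasso_objective(y_true, y_pred, w, alpha):
    """RSS plus an L1 penalty on the coefficients (intercept excluded from w)."""
    rss = np.sum((np.asarray(y_true) - np.asarray(y_pred)) ** 2)
    l1_penalty = alpha * np.sum(np.abs(w))
    return rss + l1_penalty
```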

In scikit-learn

In scikit-learn, the following expression is used as the penalized residual sum of squares.

$$\sum_{i=1}^{n} (y_i - \hat{y}_i)^2 + 2 \cdot n \cdot \alpha \sum_{k=1}^{d} |\omega_k|$$
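This is scikit-learn's documented Lasso objective, $\frac{1}{2n}\sum_{i=1}^{n}(y_i - \hat{y}_i)^2 + \alpha \sum_{k=1}^{d}|\omega_k|$, multiplied through by $2n$. A small sketch that checks this scaling numerically with `sklearn.linear_model.Lasso` on toy data (the data and the `alpha` value are arbitrary):

```python
import numpy as np
from sklearn.linear_model import Lasso

# toy data; the values only serve to illustrate the scaling
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.5, -2.0, 0.0]) + rng.normal(scale=0.1, size=50)

alpha = 0.1
model = Lasso(alpha=alpha).fit(X, y)
y_pred = model.predict(X)
n = len(y)

# scikit-learn's documented objective: (1 / (2n)) * RSS + alpha * ||w||_1
sk_objective = np.sum((y - y_pred) ** 2) / (2 * n) + alpha * np.sum(np.abs(model.coef_))

# the form above: RSS + 2 * n * alpha * ||w||_1  (the same objective times 2n)
scaled_form = np.sum((y - y_pred) ** 2) + 2 * n * alpha * np.sum(np.abs(model.coef_))

print(np.isclose(scaled_form, 2 * n * sk_objective))  # True
```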

Residual sum of squares for ridge regression

$$\sum_{i=1}^{n} (y_i - \hat{y}_i)^2 + \alpha \sum_{k=1}^{d} |\omega_k|^2$$

Here $\alpha$ is the regularization strength and $\sum_{k=1}^{d} |\omega_k|^2$ is the sum of the squares of all parameters except the intercept.
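A minimal sketch of the ridge objective, with the same placeholder names as before:

```python
import numpy as np

def ridge_objective(y_true, y_pred, w, alpha):
    """RSS plus an L2 penalty: alpha times the sum of squared coefficients."""
    rss = np.sum((np.asarray(y_true) - np.asarray(y_pred)) ** 2)
    l2_penalty = alpha * np.sum(np.asarray(w) ** 2)
    return rss + l2_penalty
```

Unlike Lasso, scikit-learn's Ridge documents its objective in this same unscaled form, so no extra factor of $n$ appears there.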

