【For the E Certification Exam】Optimization Methods: A Memorization Cheat Sheet

Posted at 2026-02-14

Introduction

This is a cheat sheet for anyone who wants to memorize the update formulas of optimization methods while studying for the E Certification exam.
The notation differs from one reference book to another, which I found confusing, so I have organized the methods in the form that is easiest for me to understand.

| Method | Update rule (for memorization) | Auxiliary update(s) | Key points |
| --- | --- | --- | --- |
| Batch gradient descent (Batch GD) | $ \theta_{t+1} = \theta_t - \eta \cdot \nabla_\theta L(\theta_t) $ | none | Gradient computed over the full dataset |
| Stochastic gradient descent (SGD) | $ \theta_{t+1} = \theta_t - \eta \cdot \nabla_\theta L_i(\theta_t) $ | none | Update from a single sample |
| Momentum | $ \theta_{t+1} = \theta_t + v_t $ | $ v_t = \alpha v_{t-1} - \eta \cdot \nabla_\theta L(\theta_t) $ | Adds inertia |
| NAG | $ \theta_{t+1} = \theta_t + v_t $ | $ v_t = \alpha v_{t-1} - \eta \cdot \nabla_\theta L(\theta_t + \alpha v_{t-1}) $ | Look-ahead gradient |
| AdaGrad | $ \theta_{t+1} = \theta_t - \eta \cdot \dfrac{1}{\sqrt{h_t} + \epsilon} \cdot \nabla_\theta L(\theta_t) $ | $ h_t = h_{t-1} + \left( \nabla_\theta L(\theta_t) \right)^2 $ | Learning rate decreases monotonically |
| RMSProp | $ \theta_{t+1} = \theta_t - \eta \cdot \dfrac{1}{\sqrt{h_t} + \epsilon} \cdot \nabla_\theta L(\theta_t) $ | $ h_t = \rho h_{t-1} + (1-\rho)\left( \nabla_\theta L(\theta_t) \right)^2 $ | Exponential moving average of squared gradients |
| Adam | $ \theta_{t+1} = \theta_t - \eta \cdot \dfrac{\hat m_t}{\sqrt{\hat v_t} + \epsilon} $ | $ m_t = \beta_1 m_{t-1} + (1-\beta_1)\nabla_\theta L(\theta_t) $ <br> $ v_t = \beta_2 v_{t-1} + (1-\beta_2)\left(\nabla_\theta L(\theta_t)\right)^2 $ <br> $ \hat m_t = \dfrac{m_t}{1-\beta_1^t} $ <br> $ \hat v_t = \dfrac{v_t}{1-\beta_2^t} $ | Momentum + RMSProp |
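As a memorization aid, the update rules in the table can be sketched directly in NumPy. This is a minimal illustration under my own assumptions, not part of the original article: the function names and the hyperparameter defaults (`lr`, `alpha`, `beta1`, `beta2`, `eps`) are illustrative choices, not values the cheat sheet prescribes.

```python
import numpy as np

def sgd_step(theta, grad, lr=0.1):
    """SGD: theta <- theta - lr * grad (grad computed on one sample)."""
    return theta - lr * grad

def momentum_step(theta, v, grad, lr=0.1, alpha=0.9):
    """Momentum: v <- alpha*v - lr*grad, then theta <- theta + v."""
    v = alpha * v - lr * grad
    return theta + v, v

def adam_step(theta, m, v, grad, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """Adam: exponential moving averages of grad and grad^2,
    with bias correction (t starts at 1)."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    m_hat = m / (1 - beta1**t)          # corrects the zero-initialization bias of m
    v_hat = v / (1 - beta2**t)          # same for v
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy run: minimize L(theta) = theta^2 (gradient 2*theta) with Adam.
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 501):
    grad = 2 * theta
    theta, m, v = adam_step(theta, m, v, grad, t)
# |theta| shrinks toward 0 as the loop runs.
```

Note how each function mirrors one row of the table: the auxiliary variables (`v`, `m`) are carried between calls exactly as $v_t$, $m_t$ carry over between iterations.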

Image version

Since the table's columns get squashed on Qiita, here is an image version as well.

image.png
