[Today's Abstract] Adam: A Method for Stochastic Optimization [Paper Translated with DeepL]

Posted at 2020-03-18

Once a day (as a goal I try to keep), I read the Abstract of a paper with the help of DeepL translation.

This article is something like a personal memo.
Most of it is brought to you by DeepL translation.
I would be glad if you could point out any mistakes.

Source
Adam: A Method for Stochastic Optimization

Abstract

Translation

We introduce Adam, a first-order gradient-based optimization algorithm for stochastic objective functions, based on adaptive estimation of lower-order moments. The method is simple to implement, computationally efficient, requires little memory, is invariant to diagonal rescaling of the gradients, and is well suited to problems that are large in terms of data or parameters. The method is also suitable for non-stationary objectives and for problems with very noisy or sparse gradients. The hyperparameters have intuitive interpretations and usually require little tuning. Some connections to related algorithms, by which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results show that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we describe AdaMax, a variant of Adam based on the infinity norm.

Original

We introduce Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments. The method is straightforward to implement, is computationally efficient, has little memory requirements, is invariant to diagonal rescaling of the gradients, and is well suited for problems that are large in terms of data and/or parameters. The method is also appropriate for non-stationary objectives and problems with very noisy and/or sparse gradients. The hyper-parameters have intuitive interpretations and typically require little tuning. Some connections to related algorithms, on which Adam was inspired, are discussed. We also analyze the theoretical convergence properties of the algorithm and provide a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework. Empirical results demonstrate that Adam works well in practice and compares favorably to other stochastic optimization methods. Finally, we discuss AdaMax, a variant of Adam based on the infinity norm.
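
As a personal supplementary note, here is a minimal NumPy sketch of the Adam update rule (Algorithm 1 of the paper) and of the infinity-norm variant AdaMax mentioned at the end of the abstract. The function names, the toy objective, and the loop are my own illustration, not code from the paper; the default hyperparameters (alpha = 0.001, beta1 = 0.9, beta2 = 0.999, eps = 1e-8 for Adam, alpha = 0.002 for AdaMax) are the ones the paper suggests.

```python
import numpy as np

def adam_update(theta, grad, m, v, t,
                alpha=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step; t is the 1-based timestep used for bias correction."""
    m = beta1 * m + (1 - beta1) * grad        # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad**2     # biased second-moment estimate
    m_hat = m / (1 - beta1**t)                # bias-corrected first moment
    v_hat = v / (1 - beta2**t)                # bias-corrected second moment
    theta = theta - alpha * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

def adamax_update(theta, grad, m, u, t,
                  alpha=0.002, beta1=0.9, beta2=0.999):
    """One AdaMax step: the second moment is an exponentially
    weighted infinity norm instead of a squared-gradient average."""
    m = beta1 * m + (1 - beta1) * grad
    u = np.maximum(beta2 * u, np.abs(grad))   # infinity-norm-based scale (assumed > 0)
    theta = theta - (alpha / (1 - beta1**t)) * m / u
    return theta, m, u

# Toy usage (my own example): minimize f(theta) = ||theta||^2.
theta = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(theta)
v = np.zeros_like(theta)
for t in range(1, 5001):
    grad = 2 * theta                          # gradient of ||theta||^2
    theta, m, v = adam_update(theta, grad, m, v, t)
print(theta)                                  # approximately [0, 0, 0]
```

Note how the effective step size is roughly bounded by alpha regardless of the gradient scale, which is why the abstract can claim invariance to diagonal rescaling of the gradients.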
