
[DL] What are Softmax and Cross Entropy?

Posted at 2016-02-15

The formulas are simple, but I keep forgetting them ("wait, what was it again?"), so I'm writing them down here.

Purpose of Softmax

Turn scores (logits) into probabilities.

In a neural network, a value Y obtained as Y = Wx + b, by multiplying the input by a weight W and adding a bias b, can be anything from negative to positive. Feeding Y into the softmax formula turns it into something probability-like.
A probability must satisfy:
1. every value is positive
2. the values sum to 1
so we pass the scores through the exponential function to make them positive, then normalize by dividing each one by the sum of all of them so they add up to 1.
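The recipe above (exponentiate, then divide by the sum) can be sketched in a few lines of Python. This is a minimal illustration; the function and variable names are my own, not from the article:

```python
import math

def softmax(scores):
    """Exponentiate to make every value positive, then divide by the
    sum so the results add up to 1, i.e. behave like probabilities."""
    # Subtracting the max score first is a common trick to avoid
    # overflow in exp(); it does not change the result.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)       # every value is positive, larger score -> larger probability
print(sum(probs))  # the values sum to 1
```

Note that the ordering of the scores is preserved: the largest logit gets the largest probability.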

The exponential function looks like this (plot of e^x from Wikipedia: always positive, rapidly increasing).

S(y_i) = e^{y_i} / Σ_j e^{y_j}

Cross Entropy

Cross entropy measures how far the softmax output S is from the one-hot label vector L:

D(S, L) = -Σ_i L_i log(S_i)
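As a minimal sketch (the names are my own, not from the article), cross entropy between a softmax output S and a one-hot label L, D(S, L) = -Σ_i L_i log(S_i), can be computed as:

```python
import math

def cross_entropy(probs, labels):
    """D(S, L) = -sum_i L_i * log(S_i).
    probs:  softmax output (positive, sums to 1)
    labels: one-hot label vector"""
    # Skip zero-label terms so we never take log of an irrelevant entry.
    return -sum(l * math.log(p) for p, l in zip(probs, labels) if l > 0)

labels = [1.0, 0.0, 0.0]  # the correct class is index 0
print(cross_entropy([0.7, 0.2, 0.1], labels))  # small loss: confident and correct
print(cross_entropy([0.1, 0.2, 0.7], labels))  # large loss: confident but wrong
```

With a one-hot label, the sum collapses to -log of the probability assigned to the correct class, so the loss is near 0 when that probability is near 1 and grows without bound as it approaches 0.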
