
TensorFlow > link > ReLU > Compared to other functions the usage of ReLU is preferable, because it results in the neural network training several times faster

Posted at 2016-10-09

ReLU, as introduced in the Deep MNIST for Experts tutorial:
https://www.tensorflow.org/versions/master/tutorials/mnist/pros/index.html#deep-mnist-for-experts

A unit employing the rectifier is also called a rectified linear unit (ReLU).[4]
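
Concretely, the rectifier is just f(x) = max(0, x) applied element-wise. A minimal sketch in Python (NumPy for the definition; TensorFlow exposes the same activation as `tf.nn.relu`):

```python
import numpy as np

def relu(x):
    # Rectifier: f(x) = max(0, x), applied element-wise.
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(relu(x))  # [0. 0. 0. 1. 3.]
```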

Searching for "ReLU" turns up a lot of results that are not explanations of the term itself. I would like to find some way around how hard this makes the information to locate.

Compared to other functions the usage of ReLU is preferable, because it results in the neural network training several times faster,[35]

[35] Krizhevsky, A.; Sutskever, I.; Hinton, G. E. (2012). "ImageNet classification with deep convolutional neural networks". Advances in Neural Information Processing Systems. 1: 1097–1105.
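
As a one-line intuition for the speedup (my own summary, not a quote from the cited paper): the rectifier does not saturate for positive inputs, so its gradient stays at 1 there, whereas the sigmoid's gradient is at most 1/4 and vanishes for large |x|:

```math
\mathrm{ReLU}'(x) = \begin{cases} 1 & (x > 0) \\ 0 & (x < 0) \end{cases},
\qquad
\sigma'(x) = \sigma(x)\,\bigl(1 - \sigma(x)\bigr) \le \tfrac{1}{4}
```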
