https://www.tensorflow.org/versions/master/tutorials/mnist/pros/index.html#deep-mnist-for-experts
introduces ReLU.
A unit employing the rectifier is also called a rectified linear unit (ReLU).[4]
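As a quick illustration (my own sketch, not code from the tutorial above), the rectifier itself is just f(x) = max(0, x) applied elementwise:

```python
import numpy as np

def relu(x):
    # Rectifier: f(x) = max(0, x), applied elementwise
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
```

Negative inputs are clamped to zero while positive inputs pass through unchanged.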
Searching for "ReLU" turns up many results that are not explanations of the term itself. I wish this kind of hard-to-find information were easier to track down.
Compared to other functions the usage of ReLU is preferable, because it results in the neural network training several times faster.[35]
[35] Krizhevsky, A.; Sutskever, I.; Hinton, G. E. (2012). "Imagenet classification with deep convolutional neural networks". Advances in Neural Information Processing Systems. 1: 1097–1105.