@4Ui_iUrz1

# TensorFlow's accuracy drop was caused by log(0)


```
.
.
.
step:67 train:0.894584 test:0.756296
step:68 train:0.900654 test:0.756944
step:69 train:0.897526 test:0.758796
step:70 train:0.361345 test:0.333333
step:71 train:0.361345 test:0.333333
step:72 train:0.361345 test:0.333333
step:73 train:0.361345 test:0.333333
.
.
.
```

```
(pdb) w1
array([[[[ nan,  nan,  nan, ...,  nan,  nan,  nan],
[ nan,  nan,  nan, ...,  nan,  nan,  nan],
[ nan,  nan,  nan, ...,  nan,  nan,  nan]],

[[ nan,  nan,  nan, ...,  nan,  nan,  nan],
[ nan,  nan,  nan, ...,  nan,  nan,  nan],
[ nan,  nan,  nan, ...,  nan,  nan,  nan]],
.
.
.
```

Since the weights had turned into NaN, a search for "tensorflow nan" turned up a fix:
http://stackoverflow.com/questions/33712178/tensorflow-nan-bug

```
cross_entropy = -tf.reduce_sum(labels*tf.log(y_conv))
```

As written, `y_conv` can contain exact zeros, so `tf.log` evaluates log(0) and the loss can become NaN, which then propagates into the weights.
Clipping `y_conv` into the range 1e-10 to 1.0 before taking the log fixed it:

```
cross_entropy = -tf.reduce_sum(labels*tf.log(tf.clip_by_value(y_conv,1e-10,1.0)))
```
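The effect of the clip can be reproduced in plain Python (pure stdlib here for illustration; `clipped_log` is a hypothetical helper mirroring what `tf.clip_by_value` + `tf.log` do per element, not TensorFlow's API):

```python
import math

def clipped_log(p, eps=1e-10):
    """Clip p into [eps, 1.0] before taking the log, mirroring
    tf.log(tf.clip_by_value(p, eps, 1.0)) for a single element."""
    return math.log(min(max(p, eps), 1.0))

# Without the clip, a predicted probability of exactly 0 blows up:
# math.log(0.0) raises ValueError, while TensorFlow returns -inf,
# and 0 * -inf becomes NaN, which then poisons every weight update.
print(clipped_log(0.0))   # log(1e-10) ≈ -23.03, finite
print(clipped_log(0.5))   # log(0.5) ≈ -0.693
```

The clip bounds the loss per element at roughly 23 instead of infinity, so one overconfident wrong prediction can no longer destroy training.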

Apparently the better approach is to use the function `tf.nn.softmax_cross_entropy_with_logits` instead. → That approach did not work in my case.
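The reason `tf.nn.softmax_cross_entropy_with_logits` is numerically safer can be sketched in plain Python: it fuses softmax and log via the log-sum-exp trick, so no intermediate probability is ever materialized as an exact zero. A minimal sketch under the assumption of one-hot labels (pure stdlib for illustration, not TensorFlow's actual kernel):

```python
import math

def softmax_xent_with_logits(labels, logits):
    """Stable -sum(labels * log_softmax(logits)) via the log-sum-exp trick:
    log(sum(exp(x))) is computed as m + log(sum(exp(x - m))) with m = max(x),
    so the exponentials never overflow and no probability is ever log(0)."""
    m = max(logits)  # shift logits for numerical stability
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return -sum(l * (x - log_z) for l, x in zip(labels, logits))

# Even extreme logits stay finite, where softmax-then-log would hit log(0):
loss = softmax_xent_with_logits([1.0, 0.0], [-1000.0, 1000.0])
print(loss)   # ≈ 2000.0, finite
```

Note that this function expects raw logits (pre-softmax activations), so if the network already applies a `tf.nn.softmax` to produce `y_conv`, feeding that in gives a double softmax and wrong gradients; that kind of mismatch is one plausible reason the swap "did not work" here.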
