
Note: Using TensorFlow as an automatic differentiation tool.

sess.run()

The standard approach: build the graph, take the gradient with tf.gradients, and evaluate it in a session.

import numpy as np
import tensorflow as tf

# Build the graph: y = x^2 and its gradient dy/dx with respect to x
X = tf.placeholder(tf.float32, shape=[None, 1], name="x")
Y = X**2
gradY = tf.gradients(Y, [X])

init = tf.global_variables_initializer()
data = np.linspace(-2, 2, 100).reshape(-1, 1)

# Evaluate the gradient by feeding concrete values into the placeholder
sess = tf.Session()
sess.run(init)
ret = sess.run(gradY, feed_dict={X: data})
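
As a quick sanity check (a sketch added for illustration, not part of the original snippet), the returned values can be compared against the analytic derivative 2x:

# tf.gradients returns a list, so take the first element.
grad_values = ret[0]
print(np.allclose(grad_values, 2 * data))  # -> True
sess.close()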

tf.enable_eager_execution()

Having to run a session every time feels a bit clunky. In cases like this you can use TensorFlow's Eager Execution.
https://www.tensorflow.org/api_docs/python/tf/GradientTape
To compute a gradient in eager mode, you need to record the values of the forward pass with a GradientTape.

import numpy as np
import tensorflow as tf
import tensorflow.contrib.eager as tfe

# Equivalent to tf.enable_eager_execution() in TF 1.x
tfe.enable_eager_execution()

x = tf.constant(np.linspace(-2, 2, 100))
with tf.GradientTape() as g:
    g.watch(x)   # x is a constant, so it must be watched explicitly
    y = x * x    # the forward pass is recorded on the tape
dy_dx = g.gradient(y, x)  # d(x^2)/dx = 2x
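
For completeness, a small check (again a sketch, not part of the original post) confirms that the taped gradient matches 2x; eager tensors can be converted to NumPy arrays with .numpy():

# Compare the taped gradient with the analytic derivative.
print(np.allclose(dy_dx.numpy(), 2 * x.numpy()))  # -> True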
