
TF-Slim > link > fully_connected() > activation_fn / normalizer_fn / weights_regularizer / biases_regularizer

Posted at 2016-12-05
Environment
GeForce GTX 1070 (8GB)
ASRock Z170M Pro4S [Intel Z170chipset]
Ubuntu 14.04 LTS desktop amd64
TensorFlow v0.11
cuDNN v5.1 for Linux
CUDA v8.0
Python 2.7.6
IPython 5.1.0

I am trying to export the weights and biases learned on a sine curve so that I can use them in a C tool.

To do that, I need to learn how slim.fully_connected() works.

https://github.com/tensorflow/tensorflow/tree/master/tensorflow/contrib/slim
On the page above, the slim.fully_connected link points to the following (TensorFlow v0.12):
https://github.com/tensorflow/tensorflow/blob/r0.12/tensorflow/contrib/layers/python/layers/layers.py

layers.py
...
@add_arg_scope
def fully_connected(inputs,
                    num_outputs,
                    activation_fn=nn.relu,
                    normalizer_fn=None,
                    normalizer_params=None,
                    weights_initializer=initializers.xavier_initializer(),
                    weights_regularizer=None,
                    biases_initializer=init_ops.zeros_initializer,
                    biases_regularizer=None,
                    reuse=None,
                    variables_collections=None,
                    outputs_collections=None,
                    trainable=True,
                    scope=None):
  """Adds a fully connected layer.
  `fully_connected` creates a variable called `weights`, representing a fully
  connected weight matrix, which is multiplied by the `inputs` to produce a
  `Tensor` of hidden units. If a `normalizer_fn` is provided (such as
  `batch_norm`), it is then applied. Otherwise, if `normalizer_fn` is
  None and a `biases_initializer` is provided then a `biases` variable would be
  created and added the hidden units. Finally, if `activation_fn` is not `None`,
  it is applied to the hidden units as well.
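
According to this docstring, a single fully_connected layer with no normalizer_fn boils down to a matrix multiply, a bias add, and the activation. Below is a minimal NumPy sketch of that per-layer computation (the function name and arguments are my own for illustration, not TF-Slim API):

import numpy as np

def fully_connected_forward(x, W, b, activation_fn=None):
    # weights: multiply the inputs by the weight matrix
    hidden = np.dot(x, W)
    # biases: added only when no normalizer_fn is given
    hidden = hidden + b
    # activation_fn: applied last, if provided
    if activation_fn is not None:
        hidden = activation_fn(hidden)
    return hidden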

Here is how I am using it:

hiddens = slim.stack(input_ph, slim.fully_connected, [7,7,7], 
  activation_fn=tf.nn.sigmoid, scope="hidden")
...
prediction = slim.fully_connected(hiddens, 1, activation_fn=tf.nn.sigmoid, scope="output")
...

slim.stack() is shorthand for stacking multiple slim.fully_connected() calls.
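
For reference, the stack call above should be roughly equivalent to writing the three calls out by hand. The scope naming below (hidden/hidden_1, ...) follows the pattern shown in the TF-Slim README and is my assumption for this case:

net = slim.fully_connected(input_ph, 7, activation_fn=tf.nn.sigmoid, scope='hidden/hidden_1')
net = slim.fully_connected(net, 7, activation_fn=tf.nn.sigmoid, scope='hidden/hidden_2')
hiddens = slim.fully_connected(net, 7, activation_fn=tf.nn.sigmoid, scope='hidden/hidden_3')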

This means normalizer_fn, weights_regularizer, and biases_regularizer are not used.
For activation_fn, sigmoid is used instead of the default ReLU.

To summarize: in each of the three hidden layers and the final output layer, the input is multiplied by the weights, the bias is added, and sigmoid is applied.

When computing the output in a C (or Python) implementation from the exported weight and bias values, the same processing should be applied.
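
As a sanity check before the C version, here is a minimal NumPy sketch of that forward pass. It assumes the weights and biases have already been exported into Python lists weights and biases, ordered from the first hidden layer to the output layer (those names and the export step are my own, not part of the code above):

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def predict(x, weights, biases):
    # three 7-unit hidden layers plus the 1-unit output layer,
    # each one: multiply by weights, add bias, apply sigmoid
    for W, b in zip(weights, biases):
        x = sigmoid(np.dot(x, W) + b)
    return x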
