

TensorFlow 2 notes: using weight regularizers when writing your own training loop


When building a layer in TensorFlow 2, specifying kernel_regularizer as shown below makes the layer compute the corresponding regularization penalty term. (L2 weight regularization is also known as weight decay.) Other kinds of regularization are not covered here.

from tensorflow.keras import layers, regularizers

layers.Dense(512, activation='elu',
             kernel_regularizer=regularizers.l2(0.001))
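As a quick sanity check (my own addition, not from the original article), the penalty is collected in the layer's losses attribute once the layer has been called on some input:

import tensorflow as tf
from tensorflow.keras import layers, regularizers

layer = layers.Dense(512, activation='elu',
                     kernel_regularizer=regularizers.l2(0.001))
_ = layer(tf.ones((1, 4)))  # calling the layer builds it and records the penalty
print(layer.losses)         # a one-element list holding the L2 penalty tensor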

When you use tf.keras's model.compile and model.fit, these regularization terms are included automatically: gradients are computed on the loss plus the penalties, and training proceeds as usual.
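A minimal sketch of that path (the model, the data names x_train / y_train, and the hyperparameters here are placeholders of mine, not from the article):

model = tf.keras.Sequential([
    layers.Dense(512, activation='elu',
                 kernel_regularizer=regularizers.l2(0.001)),
    layers.Dense(10),
])
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)  # the L2 penalties are added to the loss automatically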

When you write the training loop yourself instead of relying on Keras, do it as follows (based on the official tutorial).

The regularization terms contained in the model can be accessed via model.losses, so it seems you just need to sum them, add that to the other losses, and compute the gradient of the total.

  @tf.function
  def train_step(images, labels):
    with tf.GradientTape() as tape:
      predictions = model(images, training=True)
      loss = loss_fn(labels, predictions)  # loss on the model outputs
      reg_loss = tf.add_n(model.losses)    # sum of the regularization losses
      total_loss = reg_loss + loss
    gradients = tape.gradient(total_loss, model.trainable_variables)  # compute gradients
    optimizer.apply_gradients(zip(gradients, model.trainable_variables))  # update the weights
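For completeness, here is a sketch of the pieces train_step assumes (loss_fn, optimizer) and an outer loop driving it; train_ds is a placeholder name for a tf.data.Dataset of (images, labels) batches, not something defined in the article:

  loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)
  optimizer = tf.keras.optimizers.Adam()

  EPOCHS = 5  # placeholder
  for epoch in range(EPOCHS):
    for images, labels in train_ds:  # train_ds: a batched tf.data.Dataset
      train_step(images, labels)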
