
Incorporating BatchNormalization into a model

Posted at 2018-09-09

Having heard from all sorts of people that BatchNormalization is remarkably effective, I decided to try it.

For a start, following p.187 of the "fish book", I wanted to build a model in the form
"Affine -> BatchNormalization -> ReLU",
starting from

Dense(64, activation='relu')(x)

but I puzzled for about ten minutes over how to get the activation out of Dense, so I'm posting the answer here.
Once you see it, there's really no other way to write it...

#import

from keras.layers import Dense, BatchNormalization, Activation

#For the functional API

Before (activation fused into Dense):

x = Dense(64, activation='relu')(x)

After (activation split out, so BatchNormalization fits between the affine layer and the ReLU):

x = Dense(64)(x)
x = BatchNormalization()(x)
x = Activation('relu')(x)
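Put together, a minimal end-to-end functional-API model might look like the sketch below. The input shape of 32 and the 10-class softmax output are my own illustrative choices, not from the fish book:

```python
from keras.layers import Input, Dense, BatchNormalization, Activation
from keras.models import Model

inputs = Input(shape=(32,))

# Affine -> BatchNormalization -> ReLU, as on p.187
x = Dense(64)(inputs)
x = BatchNormalization()(x)
x = Activation('relu')(x)

# illustrative output head (10 classes, assumed)
outputs = Dense(10, activation='softmax')(x)

model = Model(inputs=inputs, outputs=outputs)
```

Calling `model.summary()` confirms that BatchNormalization now appears as its own layer between the Dense layer and the Activation.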

#For the Sequential API

Before (activation fused into Dense):

model.add(Dense(64, activation='relu'))

After (activation split out):

model.add(Dense(64))
model.add(BatchNormalization())
model.add(Activation('relu'))
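The same pattern as a complete Sequential sketch (again, the input shape and output layer are illustrative assumptions of mine):

```python
from keras.models import Sequential
from keras.layers import Dense, BatchNormalization, Activation

model = Sequential()

# Affine -> BatchNormalization -> ReLU
model.add(Dense(64, input_shape=(32,)))
model.add(BatchNormalization())
model.add(Activation('relu'))

# illustrative output head (10 classes, assumed)
model.add(Dense(10, activation='softmax'))
```

Either way, the trick is the same: drop the `activation=` argument from Dense and add the Activation as a separate layer after BatchNormalization.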