
【CIFAR-10】Object Recognition in Images: 95.94% with a VGG16 model via input-size upconversion ♬

Posted at 2019-04-04

I tried this competition again, almost exactly a year since my last attempt.
Last year my goal was to exceed 90% with a Caps-Net (a small model), and I achieved it.
This year, my plan was to apply the upconversion technique from my "big Mayuyu" secret weapon: if CIFAR-10's tiny inputs are blown up, they become easier to see even for the human eye, so a better score seemed likely.
As a result, fine-tuning VGG16 (all layers) reproducibly reached 95% or more, so I am writing it up here.
Also, because the images are enlarged in advance, before they ever reach the model, training was done with augmentation enabled this time. Incidentally, this 95.94% is higher than the Kaggle top score from four years ago in reference ①.

Incidentally, the current world record appears to be the 96.98% of reference ③, obtained with MXNet (an ensemble of resnet.py and densenet.py); the code is in reference ④.
More recently, the Residual Attention Network of reference ⑤ seems to reach 97.78%, and according to reference ⑥, 99.0% has been achieved. The current rankings are listed in reference ⑦.
【References】
・① CIFAR-10 - Object Recognition in Images @Kaggle
・② Honestly, a weird model... ^^;
・③ Easily get 96.98% accuracy using MXNet @Kaggle
・④ jamesliu/kaggle_cifar-10_mxnet
・⑤ A Gluon implement of Residual Attention Network. Best acc on cifar10-97.78%.
・⑥ GPipe: Efficient Training of Giant Neural Networks using Pipeline Parallelism
・⑦ CIFAR-10: Classify 32x32 colour images into 10 categories @Benchmarks.AI

What I did

・Enlarge the CIFAR-10 images
・Training
・Results

・Enlarge the CIFAR-10 images

This was done with the same technique as the Mayuyu image enlargement last time.
That is, judging from those earlier results, upscaling with the following OpenCV call is fast and produces clean images.

dst = cv2.resize(x_train[i], (img_rows, img_cols), interpolation=cv2.INTER_CUBIC) 

So I converted all 60,000 CIFAR-10 images as follows.

import cv2
import numpy as np
import keras
from keras.datasets import cifar10

num_classes = 10
img_rows = 128
img_cols = 128
(x_train, y_train), (x_test, y_test) = cifar10.load_data()

X_train = []
X_test = []
for i in range(50000):
    dst = cv2.resize(x_train[i], (img_rows, img_cols), interpolation=cv2.INTER_CUBIC)
    X_train.append(dst)
for i in range(10000):
    dst = cv2.resize(x_test[i], (img_rows, img_cols), interpolation=cv2.INTER_CUBIC)
    X_test.append(dst)
X_train = np.array(X_train)
X_test = np.array(X_test)

# Keep the label counts consistent with the number of samples converted above
y_train = y_train[:50000]
y_test = y_test[:10000]

x_train = X_train.astype('float32')
x_test = X_test.astype('float32')
x_train /= 255
x_test /= 255
# Convert class vectors to binary class matrices.
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)
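
As a quick sanity check (my addition, not in the original script), one of the upconverted images can be displayed:

import matplotlib.pyplot as plt

plt.imshow(x_train[0])  # float RGB in [0, 1] after the /255 above
plt.title('CIFAR-10 sample after INTER_CUBIC upconversion')
plt.show()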

From here on, the code is the same as ordinary fine-tuning.

from keras.models import Sequential, Model
from keras.layers import Input, Dense, Dropout, Flatten
from keras.applications.vgg16 import VGG16

# Load the VGG16 model with ImageNet pretrained weights
# The fully connected (FC) layers are not needed, so include_top=False
input_tensor = Input(shape=x_train.shape[1:])
vgg16 = VGG16(include_top=False, weights='imagenet', input_tensor=input_tensor)

# Build the FC head
top_model = Sequential()
top_model.add(Flatten(input_shape=vgg16.output_shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(Dropout(0.75))  # 0.75 to reduce overfitting
top_model.add(Dense(num_classes, activation='softmax'))

# Connect VGG16 and the FC head
model = Model(inputs=vgg16.input, outputs=top_model(vgg16.output))

# Freeze the layers up to just before the last conv block => this time, fine-tune all layers
# Choose which layers to train: for VGG16, e.g. 18, 15, 10, 1; 20 freezes all layers
for layer in model.layers[1:1]:  # empty slice: nothing is frozen, all layers are fine-tuned
    layer.trainable = False
# SGD is often said to be better for fine-tuning => Adam worked better here
lr = 0.00001
opt = keras.optimizers.Adam(lr, beta_1=0.5, beta_2=0.999, epsilon=1e-08, decay=1e-6)
model.compile(loss='categorical_crossentropy',
              optimizer=opt,
              metrics=['accuracy'])
# Show the model summary
model.summary()
model.save_weights('params_initial_model.hdf5', True)
# model.load_weights('params_model_epoch_karasu_a2017.hdf5')
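
For reference, partial freezing as described in the comment above would look like this (a sketch only; the cutoff index 15, which freezes everything through block4_pool and trains block5 plus the FC head, is an example and was not used for the 95.94% run):

for layer in model.layers[:15]:   # freeze block1 through block4_pool
    layer.trainable = False
for layer in model.layers[15:]:   # train block5 and the FC head
    layer.trainable = True
# recompile so the trainable flags take effect
model.compile(loss='categorical_crossentropy', optimizer=opt, metrics=['accuracy'])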

The code is available at:

SPP/VGG16_finetuning.py

Training

Training was run with the input_shape set to (32,32,3), (64,64,3), (128,128,3), and (160,160,3), as shown below.
*Unfortunately, on my environment (a GTX 1080 machine with 8 GB), any larger size stops with an out-of-memory error.
Even so, the effect of enlarging the input is striking: at (32,32,3), val_acc is 0.8667, short of even last year's record, but it reaches 0.9514 at (128,128,3) and rises to 0.9594 at (160,160,3).
During training, the learning rate is multiplied by 0.8 every 10 epochs (lr = lr*0.8).
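
The logs below were produced by a loop of roughly the following shape (a minimal sketch of my reconstruction, assuming the standard Keras ImageDataGenerator pattern; the exact augmentation settings are in SPP/VGG16_finetuning.py, and batch_size/epochs come from the settings block in the Bonus section):

from keras.preprocessing.image import ImageDataGenerator
import keras.backend as K

# assumed augmentation settings, for illustration only
datagen = ImageDataGenerator(width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)

lr = 0.00001
for i in range(epochs):
    if i % 10 == 0:
        if i != 0:
            lr = lr * 0.8  # multiply the learning rate by 0.8 every 10 epochs
            K.set_value(model.optimizer.lr, lr)
        print('i, ir= ', i, lr)
    print('Using real-time data augmentation.')
    model.fit_generator(datagen.flow(x_train, y_train, batch_size=batch_size),
                        steps_per_epoch=x_train.shape[0] // batch_size,
                        epochs=1,
                        validation_data=(x_test, y_test))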

(32,32,3)
i, ir=  90 1.342177280000001e-06
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 40s 101ms/step - loss: 0.1780 - acc: 0.9436 - val_loss: 0.4941 - val_acc: 0.8652
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1798 - acc: 0.9432 - val_loss: 0.5052 - val_acc: 0.8627
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1793 - acc: 0.9429 - val_loss: 0.4958 - val_acc: 0.8639
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1797 - acc: 0.9434 - val_loss: 0.4974 - val_acc: 0.8667
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1740 - acc: 0.9454 - val_loss: 0.5039 - val_acc: 0.8611
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1740 - acc: 0.9446 - val_loss: 0.4965 - val_acc: 0.8660
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1726 - acc: 0.9450 - val_loss: 0.4916 - val_acc: 0.8660
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1716 - acc: 0.9457 - val_loss: 0.5051 - val_acc: 0.8634
Using real-time data augmentation.
Epoch 1/1
390/390 [==============================] - 38s 98ms/step - loss: 0.1711 - acc: 0.9455 - val_loss: 0.4939 - val_acc: 0.8654
(128,128,3)
Epoch 108
390/390 [==============================] - 190s 488ms/step - loss: 0.0084 - acc: 0.9977 - val_loss: 0.3018 - val_acc: 0.9514
(160,160,3)
Epoch 51
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0056 - acc: 0.9984 - val_loss: 0.3033 - val_acc: 0.9523

After retraining with the above weights, the logs from epoch 50 onward are:

i, ir=  50 3.276800000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 341s 218ms/step - loss: 0.0028 - acc: 0.9991 - val_loss: 0.3413 - val_acc: 0.9501
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0023 - acc: 0.9995 - val_loss: 0.3061 - val_acc: 0.9561
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 352s 225ms/step - loss: 0.0020 - acc: 0.9994 - val_loss: 0.3135 - val_acc: 0.9575
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0028 - acc: 0.9993 - val_loss: 0.3111 - val_acc: 0.9568
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 326s 209ms/step - loss: 0.0013 - acc: 0.9997 - val_loss: 0.3160 - val_acc: 0.9575
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0016 - acc: 0.9996 - val_loss: 0.3371 - val_acc: 0.9551
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 346s 221ms/step - loss: 0.0021 - acc: 0.9994 - val_loss: 0.3158 - val_acc: 0.9545
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0018 - acc: 0.9995 - val_loss: 0.3376 - val_acc: 0.9560
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 344s 220ms/step - loss: 0.0012 - acc: 0.9996 - val_loss: 0.3173 - val_acc: 0.9585
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0027 - acc: 0.9993 - val_loss: 0.3391 - val_acc: 0.9540
i, ir=  60 2.6214400000000015e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 359s 230ms/step - loss: 0.0015 - acc: 0.9996 - val_loss: 0.3358 - val_acc: 0.9545
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 332s 212ms/step - loss: 0.0011 - acc: 0.9998 - val_loss: 0.3706 - val_acc: 0.9519
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.0011 - acc: 0.9998 - val_loss: 0.3391 - val_acc: 0.9573
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 327s 210ms/step - loss: 7.0502e-04 - acc: 0.9997 - val_loss: 0.3274 - val_acc: 0.9590
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 215ms/step - loss: 0.0017 - acc: 0.9995 - val_loss: 0.3675 - val_acc: 0.9495
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 327s 209ms/step - loss: 0.0019 - acc: 0.9994 - val_loss: 0.2978 - val_acc: 0.9594
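
The retraining above only required reloading the saved weights before re-entering the same loop (a minimal sketch; the checkpoint name follows the full listing in the Bonus section):

model.load_weights('params_model_epoch_karasu_a2055.hdf5', by_name=True)
# ...then rerun the augmented training loop shown earlier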

Summary

・Trained the VGG16 model on CIFAR-10 while varying the input size
・The score depends on the input size; the best accuracy, 95.94%, was recorded at (160,160,3)
・The result is reproducible: anyone can get good results simply by enlarging the input size

・Classify the images that could not be recognized this time
・Try changing the input size with other models
・The world's best results use ensembles of two or three models, so I would like to try that too (a sketch follows this list)
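
For the ensemble item, a natural first step would be to average the softmax outputs of a few independently trained models (a hypothetical sketch, not something tried in this article; models is assumed to be a list of trained Keras models sharing the same input shape):

import numpy as np

def ensemble_predict(models, x):
    # average class probabilities across models, then take the argmax
    probs = np.mean([m.predict(x) for m in models], axis=0)
    return np.argmax(probs, axis=1)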

Bonus

vgg16 = VGG16(include_top=False, weights='imagenet', input_tensor=input_tensor)

# Build the FC head
top_model = Sequential()
top_model.add(Flatten(input_shape=vgg16.output_shape[1:]))
top_model.add(Dense(256, activation='relu'))
top_model.add(Dropout(0.75))
top_model.add(Dense(num_classes, activation='softmax'))

model = Model(inputs=vgg16.input, outputs=top_model(vgg16.output))
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         (None, 128, 128, 3)       0
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 128, 128, 64)      1792
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 128, 128, 64)      36928
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 64, 64, 64)        0
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 64, 64, 128)       73856
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 64, 64, 128)       147584
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 32, 32, 128)       0
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 32, 32, 256)       295168
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 32, 32, 256)       590080
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 32, 32, 256)       590080
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 16, 16, 256)       0
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 16, 16, 512)       1180160
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 16, 16, 512)       2359808
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 16, 16, 512)       2359808
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 8, 8, 512)         0
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 8, 8, 512)         2359808
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 8, 8, 512)         2359808
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 8, 8, 512)         2359808
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 4, 4, 512)         0
_________________________________________________________________
sequential_1 (Sequential)    (None, 10)                2099978
=================================================================
Total params: 16,814,666
Trainable params: 16,814,666
Non-trainable params: 0
_________________________________________________________________
Using real-time data augmentation.
Epoch 1/100
390/390 [==============================] - 193s 495ms/step - loss: 1.5815 - acc: 0.4454 - val_loss: 0.7854 - val_acc: 0.7424
Epoch 2/100
390/390 [==============================] - 187s 478ms/step - loss: 0.8873 - acc: 0.7049 - val_loss: 0.5495 - val_acc: 0.8160
Epoch 3/100
390/390 [==============================] - 187s 479ms/step - loss: 0.6614 - acc: 0.7854 - val_loss: 0.4425 - val_acc: 0.8485
Epoch 4/100
390/390 [==============================] - 191s 489ms/step - loss: 0.5335 - acc: 0.8306 - val_loss: 0.4000 - val_acc: 0.8657
Epoch 5/100
390/390 [==============================] - 187s 480ms/step - loss: 0.4470 - acc: 0.8554 - val_loss: 0.3186 - val_acc: 0.8920
Epoch 6/100
390/390 [==============================] - 190s 486ms/step - loss: 0.3887 - acc: 0.8773 - val_loss: 0.2980 - val_acc: 0.9029
Epoch 7/100
390/390 [==============================] - 190s 487ms/step - loss: 0.3485 - acc: 0.8898 - val_loss: 0.2871 - val_acc: 0.9078
Epoch 8/100
390/390 [==============================] - 188s 482ms/step - loss: 0.3080 - acc: 0.9018 - val_loss: 0.3108 - val_acc: 0.9008
Epoch 9/100
390/390 [==============================] - 188s 481ms/step - loss: 0.2768 - acc: 0.9125 - val_loss: 0.2571 - val_acc: 0.9158
Epoch 10/100
390/390 [==============================] - 191s 490ms/step - loss: 0.2518 - acc: 0.9203 - val_loss: 0.2516 - val_acc: 0.9193
batch_size = 32 #128 #32
num_classes = 10
epochs = 100
data_augmentation = True #True #False
img_rows=160 #128 #32
img_cols=160 #128 #32
result_dir="./history"

# The data, shuffled and split between train and test sets:
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
# x_train,y_train,x_test,y_test = getDataSet(img_rows,img_cols)

X_train =[]
X_test = []
for i in range(50000):
    dst = cv2.resize(x_train[i], (img_rows, img_cols), interpolation=cv2.INTER_CUBIC) #cv2.INTER_LINEAR #cv2.INTER_CUBIC
    X_train.append(dst)
for i in range(10000):
    dst = cv2.resize(x_test[i], (img_rows, img_cols), interpolation=cv2.INTER_CUBIC)
    X_test.append(dst)
X_train = np.array(X_train)
X_test = np.array(X_test)

y_train=y_train[:50000]
y_test=y_test[:10000]
print(X_train.shape, y_train.shape)
print(X_test.shape, y_test.shape)

x_train = X_train.astype('float32')
x_test = X_test.astype('float32')
x_train /= 255
x_test /= 255

print('x_train shape:', x_train.shape)
print(x_train.shape[0], 'train samples')
print(x_test.shape[0], 'test samples')

# Convert class vectors to binary class matrices.
y_train = keras.utils.to_categorical(y_train, num_classes)
y_test = keras.utils.to_categorical(y_test, num_classes)

# Load the VGG16 model with ImageNet pretrained weights
# The fully connected (FC) layers are not needed, so include_top=False
input_tensor = Input(shape=x_train.shape[1:])
vgg16 = VGG16(include_top=False, weights='imagenet', input_tensor=input_tensor)

# Build the FC head
top_model = Sequential()
top_model.add(Flatten(input_shape=vgg16.output_shape[1:])) 
top_model.add(Dense(256, activation='relu'))
top_model.add(Dropout(0.75))
top_model.add(Dense(num_classes, activation='softmax'))

# Connect VGG16 and the FC head
model = Model(inputs=vgg16.input, outputs=top_model(vgg16.output))

# Freeze the layers up to just before the last conv block
# Choose which layers to train: for VGG16, e.g. 18, 15, 10, 1; 20 freezes all layers
for layer in model.layers[1:1]:  # empty slice: fine-tune all layers
    layer.trainable = False

# SGD is often said to be better for fine-tuning => Adam worked better here
lr = 0.00001
opt = keras.optimizers.Adam(lr, beta_1=0.5, beta_2=0.999, epsilon=1e-08, decay=1e-6)
# opt = keras.optimizers.SGD(lr=1e-4, momentum=0.9)
model.compile(loss='categorical_crossentropy',
              optimizer=opt,
              metrics=['accuracy'])

# Show the model summary
model.summary()
model.save_weights('params_initial_model.hdf5', True) 
model.load_weights('params_model_epoch_karasu_a2055.hdf5', by_name=True)
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_1 (InputLayer)         (None, 160, 160, 3)       0
_________________________________________________________________
block1_conv1 (Conv2D)        (None, 160, 160, 64)      1792
_________________________________________________________________
block1_conv2 (Conv2D)        (None, 160, 160, 64)      36928
_________________________________________________________________
block1_pool (MaxPooling2D)   (None, 80, 80, 64)        0
_________________________________________________________________
block2_conv1 (Conv2D)        (None, 80, 80, 128)       73856
_________________________________________________________________
block2_conv2 (Conv2D)        (None, 80, 80, 128)       147584
_________________________________________________________________
block2_pool (MaxPooling2D)   (None, 40, 40, 128)       0
_________________________________________________________________
block3_conv1 (Conv2D)        (None, 40, 40, 256)       295168
_________________________________________________________________
block3_conv2 (Conv2D)        (None, 40, 40, 256)       590080
_________________________________________________________________
block3_conv3 (Conv2D)        (None, 40, 40, 256)       590080
_________________________________________________________________
block3_pool (MaxPooling2D)   (None, 20, 20, 256)       0
_________________________________________________________________
block4_conv1 (Conv2D)        (None, 20, 20, 512)       1180160
_________________________________________________________________
block4_conv2 (Conv2D)        (None, 20, 20, 512)       2359808
_________________________________________________________________
block4_conv3 (Conv2D)        (None, 20, 20, 512)       2359808
_________________________________________________________________
block4_pool (MaxPooling2D)   (None, 10, 10, 512)       0
_________________________________________________________________
block5_conv1 (Conv2D)        (None, 10, 10, 512)       2359808
_________________________________________________________________
block5_conv2 (Conv2D)        (None, 10, 10, 512)       2359808
_________________________________________________________________
block5_conv3 (Conv2D)        (None, 10, 10, 512)       2359808
_________________________________________________________________
block5_pool (MaxPooling2D)   (None, 5, 5, 512)         0
_________________________________________________________________
sequential_1 (Sequential)    (None, 10)                3279626
=================================================================
Total params: 17,994,314
Trainable params: 17,994,314
Non-trainable params: 0
_________________________________________________________________
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 214ms/step - loss: 1.2563 - acc: 0.5662 - val_loss: 0.5514 - val_acc: 0.8181
i, ir=  0 1e-05
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 349s 224ms/step - loss: 0.6307 - acc: 0.7958 - val_loss: 0.3822 - val_acc: 0.8771
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 212ms/step - loss: 0.4702 - acc: 0.8492 - val_loss: 0.3010 - val_acc: 0.9027
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.3815 - acc: 0.8791 - val_loss: 0.2760 - val_acc: 0.9069
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 338s 217ms/step - loss: 0.3242 - acc: 0.8974 - val_loss: 0.2443 - val_acc: 0.9172
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 214ms/step - loss: 0.2785 - acc: 0.9122 - val_loss: 0.2870 - val_acc: 0.9068
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.2393 - acc: 0.9248 - val_loss: 0.2144 - val_acc: 0.9301
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 217ms/step - loss: 0.2139 - acc: 0.9306 - val_loss: 0.2178 - val_acc: 0.9307
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.1898 - acc: 0.9400 - val_loss: 0.2169 - val_acc: 0.9296
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 333s 213ms/step - loss: 0.1675 - acc: 0.9463 - val_loss: 0.2166 - val_acc: 0.9331
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 336s 215ms/step - loss: 0.1464 - acc: 0.9525 - val_loss: 0.2048 - val_acc: 0.9373
i, ir=  10 8.000000000000001e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.1224 - acc: 0.9612 - val_loss: 0.2018 - val_acc: 0.9370
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 334s 214ms/step - loss: 0.1094 - acc: 0.9640 - val_loss: 0.1945 - val_acc: 0.9408
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 331s 212ms/step - loss: 0.1005 - acc: 0.9676 - val_loss: 0.2100 - val_acc: 0.9386
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 341s 218ms/step - loss: 0.0870 - acc: 0.9728 - val_loss: 0.2758 - val_acc: 0.9285
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 212ms/step - loss: 0.0806 - acc: 0.9742 - val_loss: 0.2047 - val_acc: 0.9427
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0735 - acc: 0.9769 - val_loss: 0.2220 - val_acc: 0.9401
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 352s 225ms/step - loss: 0.0672 - acc: 0.9778 - val_loss: 0.2039 - val_acc: 0.9439
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 347s 222ms/step - loss: 0.0583 - acc: 0.9811 - val_loss: 0.2265 - val_acc: 0.9433
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 210ms/step - loss: 0.0542 - acc: 0.9831 - val_loss: 0.2252 - val_acc: 0.9416
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 332s 212ms/step - loss: 0.0495 - acc: 0.9842 - val_loss: 0.2293 - val_acc: 0.9438
i, ir=  20 6.400000000000001e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 338s 217ms/step - loss: 0.0418 - acc: 0.9871 - val_loss: 0.2486 - val_acc: 0.9400
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 323s 207ms/step - loss: 0.0364 - acc: 0.9886 - val_loss: 0.2697 - val_acc: 0.9367
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 331s 212ms/step - loss: 0.0341 - acc: 0.9900 - val_loss: 0.2714 - val_acc: 0.9366
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 337s 216ms/step - loss: 0.0319 - acc: 0.9901 - val_loss: 0.2337 - val_acc: 0.9433
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 332s 212ms/step - loss: 0.0308 - acc: 0.9904 - val_loss: 0.2372 - val_acc: 0.9457
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 327s 209ms/step - loss: 0.0285 - acc: 0.9914 - val_loss: 0.2433 - val_acc: 0.9470
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 333s 213ms/step - loss: 0.0273 - acc: 0.9914 - val_loss: 0.2627 - val_acc: 0.9421
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0237 - acc: 0.9928 - val_loss: 0.2590 - val_acc: 0.9454
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 345s 221ms/step - loss: 0.0229 - acc: 0.9930 - val_loss: 0.2522 - val_acc: 0.9452
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 351s 225ms/step - loss: 0.0227 - acc: 0.9934 - val_loss: 0.3140 - val_acc: 0.9366
i, ir=  30 5.120000000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 336s 215ms/step - loss: 0.0179 - acc: 0.9944 - val_loss: 0.2813 - val_acc: 0.9455
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 347s 222ms/step - loss: 0.0157 - acc: 0.9950 - val_loss: 0.2736 - val_acc: 0.9454
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 338s 216ms/step - loss: 0.0179 - acc: 0.9945 - val_loss: 0.2816 - val_acc: 0.9454
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 348s 223ms/step - loss: 0.0155 - acc: 0.9952 - val_loss: 0.2478 - val_acc: 0.9520
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 354s 226ms/step - loss: 0.0141 - acc: 0.9957 - val_loss: 0.2830 - val_acc: 0.9442
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 338s 216ms/step - loss: 0.0146 - acc: 0.9955 - val_loss: 0.2618 - val_acc: 0.9468
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 346s 221ms/step - loss: 0.0138 - acc: 0.9958 - val_loss: 0.2701 - val_acc: 0.9469
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0127 - acc: 0.9961 - val_loss: 0.2520 - val_acc: 0.9499
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 334s 214ms/step - loss: 0.0120 - acc: 0.9966 - val_loss: 0.2887 - val_acc: 0.9480
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 215ms/step - loss: 0.0132 - acc: 0.9962 - val_loss: 0.2700 - val_acc: 0.9471
i, ir=  40 4.096000000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 337s 216ms/step - loss: 0.0109 - acc: 0.9967 - val_loss: 0.2896 - val_acc: 0.9461
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 342s 219ms/step - loss: 0.0090 - acc: 0.9973 - val_loss: 0.2723 - val_acc: 0.9503
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 327s 209ms/step - loss: 0.0094 - acc: 0.9972 - val_loss: 0.2657 - val_acc: 0.9517
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 356s 228ms/step - loss: 0.0091 - acc: 0.9972 - val_loss: 0.2934 - val_acc: 0.9474
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0100 - acc: 0.9968 - val_loss: 0.2702 - val_acc: 0.9486
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 342s 219ms/step - loss: 0.0078 - acc: 0.9976 - val_loss: 0.2965 - val_acc: 0.9474
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 352s 226ms/step - loss: 0.0079 - acc: 0.9976 - val_loss: 0.2892 - val_acc: 0.9513
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.0066 - acc: 0.9979 - val_loss: 0.3675 - val_acc: 0.9425
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 358s 229ms/step - loss: 0.0079 - acc: 0.9976 - val_loss: 0.3056 - val_acc: 0.9486
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 348s 223ms/step - loss: 0.0089 - acc: 0.9977 - val_loss: 0.2916 - val_acc: 0.9506
i, ir=  50 3.276800000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0056 - acc: 0.9984 - val_loss: 0.3033 - val_acc: 0.9523
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 344s 220ms/step - loss: 0.0068 - acc: 0.9980 - val_loss: 0.3053 - val_acc: 0.9488
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 215ms/step - loss: 0.0048 - acc: 0.9986 - val_loss: 0.2858 - val_acc: 0.9508
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 338s 216ms/step - loss: 0.0064 - acc: 0.9982 - val_loss: 0.3031 - val_acc: 0.9506
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 337s 216ms/step - loss: 0.0053 - acc: 0.9984 - val_loss: 0.3033 - val_acc: 0.9502


Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 333s 213ms/step - loss: 0.0329 - acc: 0.9908 - val_loss: 0.4019 - val_acc: 0.9235
i, ir=  0 1e-05
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0257 - acc: 0.9926 - val_loss: 0.3482 - val_acc: 0.9313
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0234 - acc: 0.9930 - val_loss: 0.2796 - val_acc: 0.9467
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 347s 222ms/step - loss: 0.0227 - acc: 0.9934 - val_loss: 0.2716 - val_acc: 0.9459
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 218ms/step - loss: 0.0247 - acc: 0.9929 - val_loss: 0.2917 - val_acc: 0.9419
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 347s 222ms/step - loss: 0.0209 - acc: 0.9942 - val_loss: 0.2468 - val_acc: 0.9439
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0208 - acc: 0.9939 - val_loss: 0.2665 - val_acc: 0.9486
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 341s 218ms/step - loss: 0.0200 - acc: 0.9941 - val_loss: 0.2728 - val_acc: 0.9466
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 342s 219ms/step - loss: 0.0203 - acc: 0.9942 - val_loss: 0.2736 - val_acc: 0.9448
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.0182 - acc: 0.9944 - val_loss: 0.3146 - val_acc: 0.9463
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 218ms/step - loss: 0.0216 - acc: 0.9937 - val_loss: 0.2627 - val_acc: 0.9488
i, ir=  10 8.000000000000001e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0122 - acc: 0.9962 - val_loss: 0.2704 - val_acc: 0.9520
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 218ms/step - loss: 0.0140 - acc: 0.9958 - val_loss: 0.3107 - val_acc: 0.9399
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 345s 221ms/step - loss: 0.0143 - acc: 0.9960 - val_loss: 0.3049 - val_acc: 0.9435
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 210ms/step - loss: 0.0118 - acc: 0.9967 - val_loss: 0.2673 - val_acc: 0.9524
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 344s 220ms/step - loss: 0.0129 - acc: 0.9962 - val_loss: 0.2867 - val_acc: 0.9463
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 218ms/step - loss: 0.0110 - acc: 0.9966 - val_loss: 0.2648 - val_acc: 0.9519
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 215ms/step - loss: 0.0127 - acc: 0.9962 - val_loss: 0.3145 - val_acc: 0.9433
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 337s 215ms/step - loss: 0.0128 - acc: 0.9963 - val_loss: 0.2980 - val_acc: 0.9495
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 348s 223ms/step - loss: 0.0095 - acc: 0.9970 - val_loss: 0.2728 - val_acc: 0.9499
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 323s 207ms/step - loss: 0.0132 - acc: 0.9963 - val_loss: 0.2742 - val_acc: 0.9511
i, ir=  20 6.400000000000001e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 218ms/step - loss: 0.0076 - acc: 0.9978 - val_loss: 0.3079 - val_acc: 0.9489
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0078 - acc: 0.9976 - val_loss: 0.3446 - val_acc: 0.9459
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.0086 - acc: 0.9975 - val_loss: 0.2852 - val_acc: 0.9507
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 331s 212ms/step - loss: 0.0064 - acc: 0.9980 - val_loss: 0.3968 - val_acc: 0.9397
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0093 - acc: 0.9972 - val_loss: 0.3213 - val_acc: 0.9477
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 324s 207ms/step - loss: 0.0068 - acc: 0.9978 - val_loss: 0.3053 - val_acc: 0.9504
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 332s 212ms/step - loss: 0.0059 - acc: 0.9986 - val_loss: 0.2899 - val_acc: 0.9496
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 348s 223ms/step - loss: 0.0077 - acc: 0.9977 - val_loss: 0.3163 - val_acc: 0.9510
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 324s 208ms/step - loss: 0.0054 - acc: 0.9985 - val_loss: 0.3049 - val_acc: 0.9539
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0068 - acc: 0.9978 - val_loss: 0.3232 - val_acc: 0.9502
i, ir=  30 5.120000000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 344s 220ms/step - loss: 0.0060 - acc: 0.9983 - val_loss: 0.3239 - val_acc: 0.9470
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 214ms/step - loss: 0.0043 - acc: 0.9988 - val_loss: 0.3217 - val_acc: 0.9526
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0064 - acc: 0.9986 - val_loss: 0.3093 - val_acc: 0.9524
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 341s 219ms/step - loss: 0.0051 - acc: 0.9984 - val_loss: 0.2873 - val_acc: 0.9516
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0032 - acc: 0.9992 - val_loss: 0.3477 - val_acc: 0.9506
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 336s 215ms/step - loss: 0.0059 - acc: 0.9983 - val_loss: 0.2986 - val_acc: 0.9502
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0057 - acc: 0.9983 - val_loss: 0.2976 - val_acc: 0.9531
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0041 - acc: 0.9988 - val_loss: 0.2976 - val_acc: 0.9539
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 331s 212ms/step - loss: 0.0035 - acc: 0.9992 - val_loss: 0.3026 - val_acc: 0.9532
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0040 - acc: 0.9989 - val_loss: 0.3169 - val_acc: 0.9520
i, ir=  40 4.096000000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 348s 223ms/step - loss: 0.0037 - acc: 0.9990 - val_loss: 0.3323 - val_acc: 0.9515
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 214ms/step - loss: 0.0032 - acc: 0.9991 - val_loss: 0.3352 - val_acc: 0.9514
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 334s 214ms/step - loss: 0.0034 - acc: 0.9992 - val_loss: 0.3041 - val_acc: 0.9499
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 360s 231ms/step - loss: 0.0033 - acc: 0.9991 - val_loss: 0.3155 - val_acc: 0.9510
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 0.0030 - acc: 0.9990 - val_loss: 0.3202 - val_acc: 0.9528
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 337s 216ms/step - loss: 0.0026 - acc: 0.9992 - val_loss: 0.3327 - val_acc: 0.9526
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 352s 225ms/step - loss: 0.0027 - acc: 0.9993 - val_loss: 0.3024 - val_acc: 0.9560
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 346s 221ms/step - loss: 0.0042 - acc: 0.9989 - val_loss: 0.3326 - val_acc: 0.9534
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0034 - acc: 0.9991 - val_loss: 0.3082 - val_acc: 0.9524
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0027 - acc: 0.9994 - val_loss: 0.3539 - val_acc: 0.9503
i, ir=  50 3.276800000000002e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 341s 218ms/step - loss: 0.0028 - acc: 0.9991 - val_loss: 0.3413 - val_acc: 0.9501
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0023 - acc: 0.9995 - val_loss: 0.3061 - val_acc: 0.9561
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 352s 225ms/step - loss: 0.0020 - acc: 0.9994 - val_loss: 0.3135 - val_acc: 0.9575
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0028 - acc: 0.9993 - val_loss: 0.3111 - val_acc: 0.9568
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 326s 209ms/step - loss: 0.0013 - acc: 0.9997 - val_loss: 0.3160 - val_acc: 0.9575
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 353s 226ms/step - loss: 0.0016 - acc: 0.9996 - val_loss: 0.3371 - val_acc: 0.9551
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 346s 221ms/step - loss: 0.0021 - acc: 0.9994 - val_loss: 0.3158 - val_acc: 0.9545
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0018 - acc: 0.9995 - val_loss: 0.3376 - val_acc: 0.9560
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 344s 220ms/step - loss: 0.0012 - acc: 0.9996 - val_loss: 0.3173 - val_acc: 0.9585
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 0.0027 - acc: 0.9993 - val_loss: 0.3391 - val_acc: 0.9540
i, ir=  60 2.6214400000000015e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 359s 230ms/step - loss: 0.0015 - acc: 0.9996 - val_loss: 0.3358 - val_acc: 0.9545
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 332s 212ms/step - loss: 0.0011 - acc: 0.9998 - val_loss: 0.3706 - val_acc: 0.9519
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 0.0011 - acc: 0.9998 - val_loss: 0.3391 - val_acc: 0.9573
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 327s 210ms/step - loss: 7.0502e-04 - acc: 0.9997 - val_loss: 0.3274 - val_acc: 0.9590
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 335s 215ms/step - loss: 0.0017 - acc: 0.9995 - val_loss: 0.3675 - val_acc: 0.9495
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 327s 209ms/step - loss: 0.0019 - acc: 0.9994 - val_loss: 0.2978 - val_acc: 0.9594
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 8.9990e-04 - acc: 0.9997 - val_loss: 0.3246 - val_acc: 0.9558
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 330s 211ms/step - loss: 0.0015 - acc: 0.9997 - val_loss: 0.3336 - val_acc: 0.9565
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 331s 212ms/step - loss: 0.0011 - acc: 0.9996 - val_loss: 0.3275 - val_acc: 0.9585
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 340s 217ms/step - loss: 0.0013 - acc: 0.9997 - val_loss: 0.3982 - val_acc: 0.9489
i, ir=  70 2.0971520000000012e-06
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 337s 215ms/step - loss: 7.4697e-04 - acc: 0.9997 - val_loss: 0.3331 - val_acc: 0.9583
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 328s 210ms/step - loss: 4.5198e-04 - acc: 0.9999 - val_loss: 0.3450 - val_acc: 0.9579
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 329s 211ms/step - loss: 7.6168e-04 - acc: 0.9998 - val_loss: 0.3719 - val_acc: 0.9520
Using real-time data augmentation.
Epoch 1/1
1562/1562 [==============================] - 339s 217ms/step - loss: 9.0658e-04 - acc: 0.9997 - val_loss: 0.4198 - val_acc: 0.9476