
Visualizing the effect of Dropout

Posted at 2020-03-29

Let's visualize the effect of Dropout using the IMDB movie-review classification dataset.
An intuition for Dropout

Source: https://deepage.net/deep_learning/2016/10/17/deeplearning_dropout.html

(figure: a fully connected network with units randomly dropped during training)
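Before the experiment, here is the idea in code. This is a minimal NumPy sketch of inverted dropout (not Keras's actual implementation): each unit is zeroed with probability `rate`, and the survivors are rescaled so the expected activation is unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, rate=0.5, training=True):
    # Inverted dropout: zero each unit with probability `rate`,
    # then rescale survivors by 1 / (1 - rate) to preserve the expectation
    if not training:
        return x  # at inference time the layer is a no-op
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

activations = np.ones((2, 8))
print(dropout(activations))  # entries are either 0.0 or 2.0
```

With `training=False` the input passes through untouched, which mirrors how Keras's `Dropout` layer behaves at inference time.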

Loading and preprocessing the data

from keras.datasets import imdb
(train_data, train_labels), (test_data, test_labels) = imdb.load_data(num_words=10000)

# Vectorize the data
import numpy as np

def vectorize_sequences(sequences, dimension=10000):
    # Multi-hot encoding: one row per review, 1.0 at every word index it contains
    results = np.zeros((len(sequences), dimension))
    for i, sequence in enumerate(sequences):
        results[i, sequence] = 1.
    return results

X_train = vectorize_sequences(train_data)
X_test = vectorize_sequences(test_data)

y_train = np.asarray(train_labels).astype('float32')
y_test = np.asarray(test_labels).astype('float32')

# Split off the first 10,000 samples as a validation set
x_val = X_train[:10000]
x_train_partial = X_train[10000:]

y_val = y_train[:10000]
y_train_partial = y_train[10000:]
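To make the encoding concrete, here is `vectorize_sequences` run on two toy "reviews" (the word indices are made up for illustration):

```python
import numpy as np

def vectorize_sequences(sequences, dimension=10000):
    # Multi-hot encoding: one row per review, 1.0 at every word index it contains
    results = np.zeros((len(sequences), dimension))
    for i, sequence in enumerate(sequences):
        results[i, sequence] = 1.
    return results

toy = [[1, 3, 3], [0, 2]]
print(vectorize_sequences(toy, dimension=5))
# [[0. 1. 0. 1. 0.]
#  [1. 0. 1. 0. 0.]]
```

Duplicate indices (the repeated 3) simply set the same position to 1, so word counts are discarded and only presence is kept.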

Next, define the model.

from keras import models
from keras import layers

model1 = models.Sequential()
model1.add(layers.Dense(16, activation='relu', input_shape=(10000,)))
model1.add(layers.Dense(16, activation='relu'))
model1.add(layers.Dense(1, activation='sigmoid'))

# Compile
from keras import losses
from keras import metrics
from keras import optimizers

model1.compile(optimizer=optimizers.RMSprop(learning_rate=0.001),
               loss=losses.binary_crossentropy,
               metrics=[metrics.binary_accuracy])

Then train it.

history1 = model1.fit(x_train_partial,
                     y_train_partial,
                     epochs=20,
                     batch_size=512,
                     validation_data = (x_val, y_val))

Visualization

import matplotlib.pyplot as plt

history_dict1 = history1.history
loss_values1 = history_dict1['loss']
val_loss_values1 = history_dict1['val_loss']

epochs = range(1, len(loss_values1) + 1)

plt.plot(epochs, loss_values1, 'bo', label = 'Training loss')
plt.plot(epochs, val_loss_values1, 'b', label = 'Validation loss')
plt.title('Training and validation loss')
plt.xlabel('Epochs')
plt.ylabel('Loss')
plt.legend()
plt.show()

(figure: training and validation loss of the original model)

Now for the main topic: visualizing the effect of Dropout

Define, compile, and train the improved model

# Add Dropout layers to the original architecture
model2 = models.Sequential()
model2.add(layers.Dense(16, activation='relu', input_shape=(10000,)))
model2.add(layers.Dropout(0.5))
model2.add(layers.Dense(16, activation='relu'))
model2.add(layers.Dropout(0.5))
model2.add(layers.Dense(1, activation='sigmoid'))

model2.compile(optimizer=optimizers.RMSprop(learning_rate=0.001),
               loss=losses.binary_crossentropy,
               metrics=[metrics.binary_accuracy])

history2 = model2.fit(x_train_partial,
                     y_train_partial,
                     epochs=20,
                     batch_size=512,
                     validation_data = (x_val, y_val))

Finally, visualize the comparison against the original model.

history_dict2 = history2.history
loss_values2 = history_dict2['loss']
val_loss_values2 = history_dict2['val_loss']

epochs = range(1, len(loss_values2) + 1)


plt.plot(epochs, val_loss_values1, '+', label = 'Original model')
plt.plot(epochs, val_loss_values2, 'x', label = 'Dropout model')
plt.title('Validation loss: original vs. Dropout model')
plt.xlabel('Epochs')
plt.ylabel('Validation Loss')
plt.legend()
plt.show()

(figure: validation loss of the original and Dropout models)

You can see that the validation loss is lower with Dropout.
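One way to quantify the improvement is to compare each model's best (minimum) validation loss from the history dicts above. The numbers below are hypothetical placeholders for illustration, not the actual run:

```python
# val_loss_values1 / val_loss_values2 come from history1 / history2 above;
# these lists are hypothetical placeholder values
val_loss_values1 = [0.38, 0.30, 0.29, 0.31, 0.35]  # original model
val_loss_values2 = [0.40, 0.32, 0.28, 0.27, 0.29]  # Dropout model

best1 = min(val_loss_values1)
best2 = min(val_loss_values2)
print(f"original: {best1:.2f} at epoch {val_loss_values1.index(best1) + 1}")
print(f"dropout : {best2:.2f} at epoch {val_loss_values2.index(best2) + 1}")
# → original: 0.29 at epoch 3
# → dropout : 0.27 at epoch 4
```

Besides a lower minimum, the Dropout model typically reaches its best epoch later, since the regularization slows down overfitting.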
