Multiplication table (kuku) with Keras

Overview

I tried teaching the kuku multiplication table (1×1 through 9×9) to a small neural network with Keras.

Results

Epoch 858/1000
81/81 [==============================] - 0s - loss: 0.0276 - val_loss: 0.0275
Epoch 859/1000
81/81 [==============================] - 0s - loss: 0.0276 - val_loss: 0.0275

(epochs 860–997 omitted; loss decreases slowly from 0.0276 to about 0.0206)

Epoch 998/1000
81/81 [==============================] - 0s - loss: 0.0203 - val_loss: 0.0204
Epoch 999/1000
81/81 [==============================] - 0s - loss: 0.0204 - val_loss: 0.0205
Epoch 1000/1000
81/81 [==============================] - 0s - loss: 0.0206 - val_loss: 0.0203
      1   2   3   4   5   6   7   8   9
  1   1   2   3   4   5   6   7   8   9
  2   2   4   6   8  10  12  14  16  18
  3   3   6   1  12  15  18  21  24  27
  4   4   8  12  16  20  24  28  32  36
  5   5  10  15  28  17  30  35  40  45
  6   6  12  18  24  30  44  42  48  54
  7   7  14  21  28  35  42  49  56  63
  8   8  16  24  32  40  48  56  64  72
  9   9  18  27  36  45  54  63  72  81
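A few cells in the grid come out wrong (3×3, 4×5, 5×5, 6×6), which matches the final MSE of about 0.02 — a few output bits are still near the 0.5 threshold. A quick pure-Python check of the grid printed above against the true products:

```python
# The 9x9 grid printed above: row index j = 1..9, column index i = 1..9.
predicted = [
    [1, 2, 3, 4, 5, 6, 7, 8, 9],
    [2, 4, 6, 8, 10, 12, 14, 16, 18],
    [3, 6, 1, 12, 15, 18, 21, 24, 27],
    [4, 8, 12, 16, 20, 24, 28, 32, 36],
    [5, 10, 15, 28, 17, 30, 35, 40, 45],
    [6, 12, 18, 24, 30, 44, 42, 48, 54],
    [7, 14, 21, 28, 35, 42, 49, 56, 63],
    [8, 16, 24, 32, 40, 48, 56, 64, 72],
    [9, 18, 27, 36, 45, 54, 63, 72, 81],
]

# collect every (i, j) cell whose prediction differs from i * j
mismatches = [(i, j)
              for j in range(1, 10)
              for i in range(1, 10)
              if predicted[j - 1][i - 1] != i * j]
print(mismatches)  # → [(3, 3), (4, 5), (5, 5), (6, 6)]
```

So 77 of the 81 facts are recovered exactly after thresholding.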


Sample code

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense

def in_encode(i, j):
    # pack (i, j) into one byte: i in the low 4 bits, j in the high 4 bits
    k = j * 16 + i
    return np.array([k >> d & 1 for d in range(8)])

def out_encode(i, j):
    # the product fits in 7 bits (max 9 * 9 = 81)
    k = j * i
    return np.array([k >> d & 1 for d in range(7)])

def decode(p):
    # threshold each output bit at 0.5 and reassemble the integer
    return sum(1 << d for d in range(7) if p[d] > 0.5)

# all 81 multiplication facts, bit-encoded
trX = np.array([in_encode(i, j) for i in range(1, 10) for j in range(1, 10)])
trY = np.array([out_encode(i, j) for i in range(1, 10) for j in range(1, 10)])

model = Sequential()
model.add(Dense(40, activation='tanh', input_shape=(8,)))
model.add(Dense(40, activation='tanh'))
model.add(Dense(7, activation='linear'))
model.compile(loss='mean_squared_error', optimizer='adam')

# note: the training set doubles as the validation set here
history = model.fit(trX, trY, batch_size=60, epochs=1000, verbose=1,
                    validation_data=(trX, trY))

# print the predicted multiplication table
p = '    '
for i in range(1, 10):
    p += '%3d ' % i
p += '\n'
for j in range(1, 10):
    p += '%3d ' % j
    for i in range(1, 10):
        x = np.array([in_encode(i, j)])
        pred = model.predict(x, verbose=0)
        p += '%3d ' % decode(pred[0])
    p += '\n'
print(p)
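As a quick sanity check of the encoding scheme itself (pure Python, no Keras required — the helpers are restated here without NumPy): every (i, j) pair round-trips through `out_encode` and `decode`, so any error in the table comes from the network, not the encoding.

```python
def in_encode(i, j):
    # pack (i, j) into one byte: i in the low 4 bits, j in the high 4 bits
    k = j * 16 + i
    return [k >> d & 1 for d in range(8)]

def out_encode(i, j):
    # the product fits in 7 bits (max 9 * 9 = 81)
    k = j * i
    return [k >> d & 1 for d in range(7)]

def decode(p):
    # inverse of out_encode for exact 0/1 bits (or thresholded activations)
    return sum(1 << d for d in range(7) if p[d] > 0.5)

# 9 * 16 + 9 = 153 = 0b10011001, listed least-significant bit first
print(in_encode(9, 9))  # → [1, 0, 0, 1, 1, 0, 0, 1]

# every multiplication fact round-trips exactly
assert all(decode(out_encode(i, j)) == i * j
           for i in range(1, 10) for j in range(1, 10))
```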

That's all.
