
TensorFlow / ADDA > Learning initial-value data for the linear equations > Training code: v0.1, v0.2; result check: v0.1

Posted at 2017-04-28
Operating environment
GeForce GTX 1070 (8GB)
ASRock Z170M Pro4S [Intel Z170chipset]
Ubuntu 14.04 LTS desktop amd64
TensorFlow v0.11
cuDNN v5.1 for Linux
CUDA v8.0
Python 2.7.6
IPython 5.1.0 -- An enhanced Interactive Python.
gcc (Ubuntu 4.8.4-2ubuntu1~14.04.3) 4.8.4
GNU bash, version 4.3.8(1)-release (x86_64-pc-linux-gnu)

Overview

This article is related to ADDA (a light-scattering simulator based on the discrete dipole approximation).

What matters in an ADDA calculation are the electric-field values in the X, Y, and Z directions. Experience shows that the calculation is slow when random initial values are used and fast when the initial values are close to the final solution.

The plan is to train a deep-learning model on final solutions computed on a supercomputer and to use the trained model on an ordinary PC. This should speed up calculations on ordinary PCs and make more efficient use of the community's computing resources.

Of the X, Y, and Z field components, the real part of the X component was trained with TensorFlow.

code

Network architecture

  • Input layer (3 nodes)
    • x, y, z: dipole position
  • Hidden layers (3 layers: 100, 100, 100 nodes)
  • Output layer (1 node)
    • Exr: real part of the electric field in the X direction
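
For scale: with sigmoid hidden units and a linear output node, as in the training code below, this topology has 3·100+100 + 2·(100·100+100) + 100·1+1 = 20,701 trainable parameters.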

Generating the training data input.csv

ADDA was run with the following parameters.

Generated by ADDA v.1.3b6
command: './adda -grid 25 -orient 0 90 0 -store_int_field -maxiter 1000 '
WARNING: (../make_particle.c:1901) boxX has been adjusted from 25 to 26. Size along X-axis in the shape description is the size of new (adjusted) computational grid.
lambda: 6.283185307
shape: sphere; diameter:10.94003171
box dimensions: 26x26x26
refractive index: 1.5+0i
Dipoles/lambda: 15
        (Volume correction used)
Required relative residual norm: 1e-05
Total number of occupied dipoles: 9328
Volume-equivalent size parameter: 5.470015857

The output was then converted to a TensorFlow input file (a format defined for this work) using the code from
ADDA > convertToInputcsv_170422.py > Convert to a file for TensorFlow > v0.1.
The 9328 occupied dipoles become the 9328 training examples (NUM_EXAMPLES_PER_EPOCH_FOR_TRAIN in the training code below).
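
convertToInputcsv_170422.py itself is in the linked article; the following is only a minimal sketch of such a conversion, under assumptions: that ADDA's -store_int_field option writes a whitespace-separated file named IntField-Y with one header line and the columns x y z |E|^2 Ex.r Ex.i Ey.r Ey.i Ez.r Ez.i, and that the 9-column CSV order is x, y, z, Ex.r, Ex.i, Ey.r, Ey.i, Ez.r, Ez.i, which is what learnExr_170422.py below expects (columns 0-2 as position, column 3 as Exr).

convert_sketch.py (hypothetical, not the author's script)
import numpy as np

# assumed ADDA internal-field file: one header line, then
# x y z |E|^2 Ex.r Ex.i Ey.r Ey.i Ez.r Ez.i
field = np.loadtxt('IntField-Y', skiprows=1)

# drop the |E|^2 column (index 3); keep x, y, z and the six field components
cols = [0, 1, 2, 4, 5, 6, 7, 8, 9]
np.savetxt('input.csv', field[:, cols], delimiter=',', fmt='%.10e')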

Training code v0.1, v0.2

Training was done with the following code.

learnExr_170422.py
#!/usr/bin/env python
# -*- coding: utf-8 -*-

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import sys
import tensorflow as tf
import tensorflow.contrib.slim as slim
import numpy as np

'''
v0.2 Apr. 29, 2017
  - save to [model_variables_170429.npy]
  - learn [Exr] only, instead of [Exr, Exi]
v0.1 Apr. 23, 2017
  - change [NUM_EXAMPLES_PER_EPOCH_FOR_TRAIN] from [100] to [9328]
  - change input layer's node from [2] to [3]
  - [input.csv] has 9 columns
=== branched from [learn_xxyyfunc_170321.py] to [learnExr_170422.py] ===
v0.5 Apr. 01, 2017
  - change network from [7,7,7] to [100, 100, 100]
v0.4 Mar. 31, 2017
  - calculate [capacity] from [min_queue_examples] and [batch_size]
v0.3 Mar. 24, 2017
  - change [capacity] from 100 to 40
v0.2 Mar. 24, 2017
  - change [capacity] from 40 to 100
  - output [model_variables] after training
v0.1 Mar. 22, 2017
  - learn mapping of R^2 input to R^2 output
     + using data prepared by [prep_data_170321.py]
  - branched from sine curve learning at
    http://qiita.com/7of9/items/ce58e66b040a0795b2ae
'''

# codingrule:PEP8


filename_queue = tf.train.string_input_producer(["input.csv"])

# parse CSV
reader = tf.TextLineReader()
key, value = reader.read(filename_queue)
def_rec = [[0.], [0.], [0.], [0.], [0.], [0.], [0.], [0.], [0.]]
wrk = tf.decode_csv(value, record_defaults=def_rec)
xpos, ypos, zpos, Exr, Exi, dmy1, dmy2, dmy3, dmy4 = wrk  # columns 3,4 are Exr, Exi; the rest are unused
inputs = tf.pack([xpos, ypos, zpos])
output = tf.pack([Exr])

batch_size = 4  # [4]
# Ref: cifar10_input.py
min_fraction_of_examples_in_queue = 0.2  # 0.4
NUM_EXAMPLES_PER_EPOCH_FOR_TRAIN = 9328
min_queue_examples = int(NUM_EXAMPLES_PER_EPOCH_FOR_TRAIN *
                         min_fraction_of_examples_in_queue)
#
inputs_batch, output_batch = tf.train.shuffle_batch(
    [inputs, output], batch_size, capacity=min_queue_examples + 3 * batch_size,
    min_after_dequeue=batch_size)

input_ph = tf.placeholder("float", [None, 3])
output_ph = tf.placeholder("float", [None, 1])

## network
hiddens = slim.stack(input_ph, slim.fully_connected, [100, 100, 100],
                     activation_fn=tf.nn.sigmoid, scope="hidden")
prediction = slim.fully_connected(
    hiddens, 1, activation_fn=None, scope="output")
loss = tf.contrib.losses.mean_squared_error(prediction, output_ph)

train_op = slim.learning.create_train_op(loss, tf.train.AdamOptimizer(0.001))

init_op = tf.initialize_all_variables()

with tf.Session() as sess:
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)

    try:
        sess.run(init_op)
        for i in range(90000):  # 30000
            inpbt, outbt = sess.run([inputs_batch, output_batch])
            _, t_loss = sess.run([train_op, loss],
                                 feed_dict={input_ph: inpbt, output_ph: outbt})

            if (i+1) % 100 == 0:
                print("%d,%f" % (i+1, t_loss))
                sys.stdout.flush()

    finally:
        coord.request_stop()

    # output the model
    model_variables = slim.get_model_variables()
    res = sess.run(model_variables)
    np.save('model_variables_170429.npy', res)

    coord.join(threads)

Progression of the loss
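
The file res.learn_170429 read below is assumed to be the comma-separated step,loss pairs that learnExr_170422.py prints every 100 steps, captured for example by redirecting the script's standard output to that file.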

Jupyter Code.

check_result_170423.ipynb
%matplotlib inline

# learning [Exr] from ADDA
# Apr. 29, 2017

import numpy as np
import matplotlib.pyplot as plt

#data1 = np.loadtxt('res.learn_170423', delimiter=',')
data1 = np.loadtxt('res.learn_170429', delimiter=',')

input1 = data1[:,0]
output1 = data1[:,1]

fig = plt.figure()
ax1 = fig.add_subplot(2,1,1)

ax1.plot(input1, output1)

ax1.set_xlabel('step')
ax1.set_ylabel('loss')
ax1.set_ylim([0,0.05])
ax1.grid(True)

fig.show()

(figure: loss vs. training step)

The loss decreases steadily.

Checking the result

Procedure

Using the trained network, check Exr for given dipole coordinates (x, y, z).

To make the training target and the training result easy to compare, a file with the same format as input.csv is prepared: a copy of input.csv in which the Exr column is replaced by the learned Exr.

The code prepared in TensorFlow / ADDA > reproduce_170429.py > v0.1 > Feed input data to the trained network and output the equivalent of prediction was used (a rough sketch of this step follows the list below).

  • Input: input.csv
  • Output: input_Exr_replaced_170429.csv
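
reproduce_170429.py itself is in the linked article; the following is only a minimal NumPy sketch of the same idea. It assumes that model_variables_170429.npy holds the eight arrays returned by slim.get_model_variables() in creation order (three hidden weight/bias pairs, then the output pair), that the hidden activations are sigmoids with a linear output as in the training code, and that the whole input.csv row is copied with only column 3 overwritten.

sketch_reproduce.py (hypothetical, not the author's reproduce_170429.py)
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# assumed order: [W1, b1, W2, b2, W3, b3, Wout, bout]
params = np.load('model_variables_170429.npy', allow_pickle=True)
W1, b1, W2, b2, W3, b3, Wo, bo = params

data = np.genfromtxt('input.csv', delimiter=',')
xyz = data[:, 0:3]

# forward pass: three sigmoid hidden layers, linear output
h = sigmoid(xyz.dot(W1) + b1)
h = sigmoid(h.dot(W2) + b2)
h = sigmoid(h.dot(W3) + b3)
Exr_pred = h.dot(Wo) + bo  # shape (N, 1)

out = data.copy()
out[:, 3] = Exr_pred[:, 0]  # overwrite only the Exr column
np.savetxt('input_Exr_replaced_170429.csv', out, delimiter=',', fmt='%.10e')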

The files can be generated with ADDA, but for readers without an environment to run it they are also placed on Dropbox:
input.csv
input_Exr_replaced_170429.csv

Visualizing the result > Jupyter code v0.1

To check the training target:

$ ln -fs input.csv LN-INPUT-CSV 

To check the training result:

$ ln -fs input_Exr_replaced_170429.csv LN-INPUT-CSV 

Jupyter code for visualization.

check_resultmap_170429.ipynb
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
import math
import sys
import matplotlib.pyplot as plt
import matplotlib.cm as cm

'''
v0.1 Apr. 29, 2017
   - use snippet from [check_resultmap_170329.ipynb] v0.2
'''
# codingrule:PEP8

src_data = np.genfromtxt('LN-INPUT-CSV', delimiter=',')

xpos, ypos, zpos = src_data[:, 0], src_data[:, 1], src_data[:, 2]
Exr = src_data[:, 3]

# pick the middle z value so that a single z-slice is selected
wrk = np.unique(zpos)
if len(wrk) % 2 == 0:
    # make the number of items odd so the median is an existing grid value
    wrk = np.delete(wrk, -1)
pickUpZvalue = np.median(wrk)

# parameters
SIZE_MAP_X = 30  # size of the image (number of cells)
SIZE_MAP_Y = 30
ZOOM_X = 15.0  # scaling of the dipole coordinates onto the map
ZOOM_Y = 15.0
SHIFT_X = 15.0  # to shift the center position
SHIFT_Y = 15.0
rmap = [[0.0 for yi in range(SIZE_MAP_Y)] for xi in range(SIZE_MAP_X)]

# prepare map
for aline in zip(xpos, ypos, zpos, Exr):
    ax, ay, az = aline[0], aline[1], aline[2]
    aExr = aline[3]
    #print(aline[2])
    if abs(az - pickUpZvalue) > sys.float_info.epsilon:
        continue

    xidx = (SIZE_MAP_X * ax / ZOOM_X + SHIFT_X).astype(int)
    yidx = (SIZE_MAP_Y * ay / ZOOM_Y + SHIFT_Y).astype(int)

    #print(xidx, yidx)

    if xidx < 0 or xidx >= SIZE_MAP_X:
        continue
    if yidx < 0 or yidx >= SIZE_MAP_Y:
        continue

    rmap[xidx][yidx] = aExr  # overwrite
    #print(ax, ay, az, aExr)

# draw map
wrkarr = np.array(rmap)
figmap = np.reshape(wrkarr, (SIZE_MAP_X, SIZE_MAP_Y))
plt.imshow(figmap, extent=(0, SIZE_MAP_X, 0, SIZE_MAP_Y), cmap=cm.jet)
plt.show()
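
The map is built by taking the z-slice at the median z value, mapping each (x, y) onto the nearest cell of the SIZE_MAP_X x SIZE_MAP_Y grid (ZOOM and SHIFT scale and center the dipole coordinates), and overwriting that cell with Exr; cells containing no dipole remain 0.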

Training target

(figure: Exr map of the training target)

Training result

(figure: Exr map of the training result)

It looks too good.
What's the catch?

Related

TensorFlow > mapping > Implementation that maps an R^2 input to an R^2 output v0.2 > Reproducing (y_1, y_2) from the trained network

TODO

  • Learn Exi, Eyr, Eyi, Ezr, Ezi
  • Re-solve the linear equations using the learned results
  • Examine concepts corresponding to Tmatrix and SHmatrix
    • Which parameters are best to interpolate?