
Machine learning with docker (33) with anaconda (33): "Deep Learning with Theano" by Christopher Bourez

Posted at 2018-10-23

1. For those who want to try it right away

"Deep Learning with Theano" by Christopher Bourez


docker

Install docker; on Windows and Mac, make sure docker is running before you start.
On Windows, docker may fail to start unless Intel Virtualization is enabled in the BIOS.
You may also see security warnings along the way.

docker pull and run

$ docker pull kaizenjapan/anaconda-christopher

$ docker run -it -p 8888:8888 kaizenjapan/anaconda-christopher /bin/bash

In the shell sessions below,
(base) root@f19e2f06eabb:/# is the command prompt; the hexadecimal part may differ in your environment. Type what appears to the right of the # on that line.
Every other line is output. If the output shows errors or differs from yours, I would appreciate a note in the comments.
Move into the folder for each chapter.

When the shell inside docker and the shell of the host OS that launched docker look alike, it is easy to mix up which one you are operating in. Pay attention to docker's command prompt.

File sharing or copying

Between docker and the host OS that launched it, either share files or copy them so that generated files can be viewed in a browser or similar. URLs explaining how to do this are listed in the references section.

For copying, I ran the command on the host OS side. Replace the container ID with that of your own docker container. I opened the copied files in a browser to check their contents.

1-simple.py

(base) root@a221771835f7:/# cd Deep-Learning-with-Theano/
(base) root@a221771835f7:/Deep-Learning-with-Theano# ls
Chapter02  Chapter03  Chapter04  Chapter05  Chapter06  Chapter10  Chapter11  Chapter12	Chapter13  LICENSE  README.md
(base) root@a221771835f7:/Deep-Learning-with-Theano# cd Chapter02
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# ls
1-simple.py  2-multi.py  3-cnn.py  4-plot.py  5-cnn-with-dropout.py  6-display-activation-functions.py	README.md
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# python 1-simple.py 
Using device cpu
Loading data
Traceback (most recent call last):
  File "1-simple.py", line 13, in <module>
    with gzip.open(data_dir + "mnist.pkl.gz", 'rb') as f:
  File "/opt/conda/lib/python3.6/gzip.py", line 53, in open
    binary_file = GzipFile(filename, gz_mode, compresslevel)
  File "/opt/conda/lib/python3.6/gzip.py", line 163, in __init__
    fileobj = self.myfileobj = builtins.open(filename, mode or 'rb')
FileNotFoundError: [Errno 2] No such file or directory: '/sharedfiles/mnist.pkl.gz'

(base) root@a221771835f7:/# mkdir sharedfiles
(base) root@a221771835f7:/# cd sharedfiles/
(base) root@a221771835f7:/sharedfiles# wget https://github.com/mnielsen/neural-networks-and-deep-learning/blob/master/data/mnist.pkl.gz
--2018-10-23 10:42:27--  https://github.com/mnielsen/neural-networks-and-deep-learning/blob/master/data/mnist.pkl.gz
Resolving github.com (github.com)... 192.30.255.113, 192.30.255.112
Connecting to github.com (github.com)|192.30.255.113|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘mnist.pkl.gz’

mnist.pkl.gz                        [  <=>                                                   ]  42.39K   178KB/s    in 0.2s    

2018-10-23 10:42:28 (178 KB/s) - ‘mnist.pkl.gz’ saved [43403]
(base) root@a221771835f7:/# cd /Deep-Learning-with-Theano
(base) root@a221771835f7:/Deep-Learning-with-Theano# cd Chapter02
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# python 1-simple.py 
Using device cpu
Loading data
Traceback (most recent call last):
  File "1-simple.py", line 14, in <module>
    train_set, valid_set, test_set = pickle.load(f)
  File "/opt/conda/lib/python3.6/gzip.py", line 296, in peek
    return self._buffer.peek(n)
  File "/opt/conda/lib/python3.6/_compression.py", line 68, in readinto
    data = self.read(len(byte_view))
  File "/opt/conda/lib/python3.6/gzip.py", line 463, in read
    if not self._read_gzip_header():
  File "/opt/conda/lib/python3.6/gzip.py", line 411, in _read_gzip_header
    raise OSError('Not a gzipped file (%r)' % magic)
OSError: Not a gzipped file (b'\n\n')
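The error above traces back to the earlier `wget`: a GitHub `/blob/` URL serves an HTML page (note `[text/html]` in the wget log), not the gzip file itself, which is why `gzip` sees the magic bytes `b'\n\n'`. Replacing `/blob/` with `/raw/` in the URL should fetch the actual file. As a small sketch (the function name is my own, for illustration), the gzip magic bytes make the problem easy to detect before unpickling:

```python
# A genuine gzip stream starts with the magic bytes b'\x1f\x8b'.
# An HTML page served by a GitHub blob URL does not, so checking the
# first two bytes of a downloaded file catches this failure early.
def looks_like_gzip(first_bytes: bytes) -> bool:
    return first_bytes[:2] == b"\x1f\x8b"

print(looks_like_gzip(b"\n\n<!DOCTYPE html>"))  # False: what the blob URL served
print(looks_like_gzip(b"\x1f\x8b\x08\x00"))     # True: a real gzip header
```

In practice, re-downloading with `wget https://github.com/mnielsen/neural-networks-and-deep-learning/raw/master/data/mnist.pkl.gz` and checking the first two bytes before rerunning `1-simple.py` should avoid this error.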

4-plot.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# python 4-plot.py 
Traceback (most recent call last):
  File "4-plot.py", line 5, in <module>
    curves[0] = { 'data' : numpy.load("simple_valid_loss.npy"), 'name' : "simple"}
  File "/opt/conda/lib/python3.6/site-packages/numpy/lib/npyio.py", line 384, in load
    fid = open(file, "rb")
FileNotFoundError: [Errno 2] No such file or directory: 'simple_valid_loss.npy'

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# wget https://git.rcc.uchicago.edu/ivy2/DL_Theano/blob/d664855aa9be2761bb5e8bf346c19b1671bd9e1b/labs/8/simple_valid_loss.npy
--2018-10-23 10:48:22--  https://git.rcc.uchicago.edu/ivy2/DL_Theano/blob/d664855aa9be2761bb5e8bf346c19b1671bd9e1b/labs/8/simple_valid_loss.npy
Resolving git.rcc.uchicago.edu (git.rcc.uchicago.edu)... 128.135.112.102
Connecting to git.rcc.uchicago.edu (git.rcc.uchicago.edu)|128.135.112.102|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘simple_valid_loss.npy’

simple_valid_loss.npy               [ <=>                                                    ]  27.75K  --.-KB/s    in 0.002s  

2018-10-23 10:48:23 (15.7 MB/s) - ‘simple_valid_loss.npy’ saved [28411]


(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# wget https://git.rcc.uchicago.edu/ivy2/DL_Theano/blob/d664855aa9be2761bb5e8bf346c19b1671bd9e1b/labs/9/mlp_valid_loss.npy
--2018-10-23 10:49:20--  https://git.rcc.uchicago.edu/ivy2/DL_Theano/blob/d664855aa9be2761bb5e8bf346c19b1671bd9e1b/labs/9/mlp_valid_loss.npy
Resolving git.rcc.uchicago.edu (git.rcc.uchicago.edu)... 128.135.112.102
Connecting to git.rcc.uchicago.edu (git.rcc.uchicago.edu)|128.135.112.102|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
Saving to: ‘mlp_valid_loss.npy’

mlp_valid_loss.npy                  [ <=>                                                    ]  27.69K  --.-KB/s    in 0.001s  

2018-10-23 10:49:21 (38.2 MB/s) - ‘mlp_valid_loss.npy’ saved [28354]

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# python 4-plot.py 
Traceback (most recent call last):
  File "/opt/conda/lib/python3.6/site-packages/numpy/lib/npyio.py", line 440, in load
    return pickle.load(fid, **pickle_kwargs)
_pickle.UnpicklingError: invalid load key, '<'.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "4-plot.py", line 5, in <module>
    curves[0] = { 'data' : numpy.load("simple_valid_loss.npy"), 'name' : "simple"}
  File "/opt/conda/lib/python3.6/site-packages/numpy/lib/npyio.py", line 443, in load
    "Failed to interpret file %s as a pickle" % repr(file))
OSError: Failed to interpret file 'simple_valid_loss.npy' as a pickle
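The same problem recurs here: the `wget` calls fetched HTML blob pages, and `<` as the first byte of an HTML document is exactly the "invalid load key, '<'" that `pickle` reports. A valid `.npy` file begins with the magic string `b'\x93NUMPY'`; writing a real array shows the difference (the file name below is illustrative, not one the book's scripts use):

```python
import numpy as np

# Save a small array the way the chapter scripts presumably save their
# loss curves, then inspect the file header. A genuine .npy file starts
# with b'\x93NUMPY'; the downloaded HTML pages start with b'<'.
np.save("demo_valid_loss.npy", np.linspace(1.0, 0.1, 10))
with open("demo_valid_loss.npy", "rb") as f:
    magic = f.read(6)
print(magic == b"\x93NUMPY")  # True
```

On GitLab instances such as git.rcc.uchicago.edu, replacing `/blob/` with `/raw/` in the URL typically returns the binary file. Alternatively, running `1-simple.py` and `2-multi.py` to completion should generate `simple_valid_loss.npy` and `mlp_valid_loss.npy` locally for `4-plot.py` to read.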

6-display-activation-functions.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# python 6-display-activation-functions.py 
Compiling
Display
length 12
[array([0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.05, 0.1 , 0.15, 0.2 , 0.25, 0.3 , 0.35, 0.4 , 0.45, 0.5 , 0.55,
       0.6 , 0.65, 0.7 , 0.75, 0.8 , 0.85, 0.9 , 0.95, 1.  , 1.05, 1.1 ,
       1.15, 1.2 , 1.25, 1.3 , 1.35, 1.4 , 1.45, 1.5 , 1.55, 1.6 , 1.65,
       1.7 , 1.75, 1.8 , 1.85, 1.9 , 1.95, 2.  , 2.05, 2.1 , 2.15, 2.2 ,
       2.25, 2.3 , 2.35, 2.4 , 2.45, 2.5 , 2.55, 2.6 , 2.65, 2.7 , 2.75,
       2.8 , 2.85, 2.9 , 2.95, 3.  , 3.05, 3.1 , 3.15, 3.2 , 3.25, 3.3 ,
       3.35, 3.4 , 3.45, 3.5 , 3.55, 3.6 , 3.65, 3.7 , 3.75, 3.8 , 3.85,
       3.9 , 3.95, 4.  , 4.05, 4.1 , 4.15, 4.2 , 4.25, 4.3 , 4.35, 4.4 ,
       4.45, 4.5 , 4.55, 4.6 , 4.65, 4.7 , 4.75, 4.8 , 4.85, 4.9 , 4.95,
       5.  , 5.05, 5.1 , 5.15, 5.2 , 5.25, 5.3 , 5.35, 5.4 , 5.45, 5.5 ,
       5.55, 5.6 , 5.65, 5.7 , 5.75, 5.8 , 5.85, 5.9 , 5.95]), array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1.]), array([-1.80000000e-01, -1.78500000e-01, -1.77000000e-01, -1.75500000e-01,
       -1.74000000e-01, -1.72500000e-01, -1.71000000e-01, -1.69500000e-01,
       -1.68000000e-01, -1.66500000e-01, -1.65000000e-01, -1.63500000e-01,
       -1.62000000e-01, -1.60500000e-01, -1.59000000e-01, -1.57500000e-01,
       -1.56000000e-01, -1.54500000e-01, -1.53000000e-01, -1.51500000e-01,
       -1.50000000e-01, -1.48500000e-01, -1.47000000e-01, -1.45500000e-01,
       -1.44000000e-01, -1.42500000e-01, -1.41000000e-01, -1.39500000e-01,
       -1.38000000e-01, -1.36500000e-01, -1.35000000e-01, -1.33500000e-01,
       -1.32000000e-01, -1.30500000e-01, -1.29000000e-01, -1.27500000e-01,
       -1.26000000e-01, -1.24500000e-01, -1.23000000e-01, -1.21500000e-01,
       -1.20000000e-01, -1.18500000e-01, -1.17000000e-01, -1.15500000e-01,
       -1.14000000e-01, -1.12500000e-01, -1.11000000e-01, -1.09500000e-01,
       -1.08000000e-01, -1.06500000e-01, -1.05000000e-01, -1.03500000e-01,
       -1.02000000e-01, -1.00500000e-01, -9.90000000e-02, -9.75000000e-02,
       -9.60000000e-02, -9.45000000e-02, -9.30000000e-02, -9.15000000e-02,
       -9.00000000e-02, -8.85000000e-02, -8.70000000e-02, -8.55000000e-02,
       -8.40000000e-02, -8.25000000e-02, -8.10000000e-02, -7.95000000e-02,
       -7.80000000e-02, -7.65000000e-02, -7.50000000e-02, -7.35000000e-02,
       -7.20000000e-02, -7.05000000e-02, -6.90000000e-02, -6.75000000e-02,
       -6.60000000e-02, -6.45000000e-02, -6.30000000e-02, -6.15000000e-02,
       -6.00000000e-02, -5.85000000e-02, -5.70000000e-02, -5.55000000e-02,
       -5.40000000e-02, -5.25000000e-02, -5.10000000e-02, -4.95000000e-02,
       -4.80000000e-02, -4.65000000e-02, -4.50000000e-02, -4.35000000e-02,
       -4.20000000e-02, -4.05000000e-02, -3.90000000e-02, -3.75000000e-02,
       -3.60000000e-02, -3.45000000e-02, -3.30000000e-02, -3.15000000e-02,
       -3.00000000e-02, -2.85000000e-02, -2.70000000e-02, -2.55000000e-02,
       -2.40000000e-02, -2.25000000e-02, -2.10000000e-02, -1.95000000e-02,
       -1.80000000e-02, -1.65000000e-02, -1.50000000e-02, -1.35000000e-02,
       -1.20000000e-02, -1.05000000e-02, -9.00000000e-03, -7.50000000e-03,
       -6.00000000e-03, -4.50000000e-03, -3.00000000e-03, -1.50000000e-03,
       -6.39488462e-16,  5.00000000e-02,  1.00000000e-01,  1.50000000e-01,
        2.00000000e-01,  2.50000000e-01,  3.00000000e-01,  3.50000000e-01,
        4.00000000e-01,  4.50000000e-01,  5.00000000e-01,  5.50000000e-01,
        6.00000000e-01,  6.50000000e-01,  7.00000000e-01,  7.50000000e-01,
        8.00000000e-01,  8.50000000e-01,  9.00000000e-01,  9.50000000e-01,
        1.00000000e+00,  1.05000000e+00,  1.10000000e+00,  1.15000000e+00,
        1.20000000e+00,  1.25000000e+00,  1.30000000e+00,  1.35000000e+00,
        1.40000000e+00,  1.45000000e+00,  1.50000000e+00,  1.55000000e+00,
        1.60000000e+00,  1.65000000e+00,  1.70000000e+00,  1.75000000e+00,
        1.80000000e+00,  1.85000000e+00,  1.90000000e+00,  1.95000000e+00,
        2.00000000e+00,  2.05000000e+00,  2.10000000e+00,  2.15000000e+00,
        2.20000000e+00,  2.25000000e+00,  2.30000000e+00,  2.35000000e+00,
        2.40000000e+00,  2.45000000e+00,  2.50000000e+00,  2.55000000e+00,
        2.60000000e+00,  2.65000000e+00,  2.70000000e+00,  2.75000000e+00,
        2.80000000e+00,  2.85000000e+00,  2.90000000e+00,  2.95000000e+00,
        3.00000000e+00,  3.05000000e+00,  3.10000000e+00,  3.15000000e+00,
        3.20000000e+00,  3.25000000e+00,  3.30000000e+00,  3.35000000e+00,
        3.40000000e+00,  3.45000000e+00,  3.50000000e+00,  3.55000000e+00,
        3.60000000e+00,  3.65000000e+00,  3.70000000e+00,  3.75000000e+00,
        3.80000000e+00,  3.85000000e+00,  3.90000000e+00,  3.95000000e+00,
        4.00000000e+00,  4.05000000e+00,  4.10000000e+00,  4.15000000e+00,
        4.20000000e+00,  4.25000000e+00,  4.30000000e+00,  4.35000000e+00,
        4.40000000e+00,  4.45000000e+00,  4.50000000e+00,  4.55000000e+00,
        4.60000000e+00,  4.65000000e+00,  4.70000000e+00,  4.75000000e+00,
        4.80000000e+00,  4.85000000e+00,  4.90000000e+00,  4.95000000e+00,
        5.00000000e+00,  5.05000000e+00,  5.10000000e+00,  5.15000000e+00,
        5.20000000e+00,  5.25000000e+00,  5.30000000e+00,  5.35000000e+00,
        5.40000000e+00,  5.45000000e+00,  5.50000000e+00,  5.55000000e+00,
        5.60000000e+00,  5.65000000e+00,  5.70000000e+00,  5.75000000e+00,
        5.80000000e+00,  5.85000000e+00,  5.90000000e+00,  5.95000000e+00]), array([0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03, 0.03,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ]), array([0.00247262, 0.00259907, 0.00273196, 0.00287163, 0.00301842,
       0.00317268, 0.00333481, 0.00350519, 0.00368424, 0.0038724 ,
       0.00407014, 0.00427793, 0.00449627, 0.00472571, 0.0049668 ,
       0.00522013, 0.0054863 , 0.00576597, 0.0060598 , 0.00636852,
       0.00669285, 0.00703359, 0.00739154, 0.00776757, 0.00816257,
       0.00857749, 0.0090133 , 0.00947104, 0.0099518 , 0.01045671,
       0.01098694, 0.01154375, 0.01212843, 0.01274235, 0.01338692,
       0.01406363, 0.01477403, 0.01551976, 0.0163025 , 0.01712403,
       0.01798621, 0.01889096, 0.01984031, 0.02083634, 0.02188127,
       0.02297737, 0.02412702, 0.0253327 , 0.02659699, 0.02792257,
       0.02931223, 0.03076886, 0.03229546, 0.03389516, 0.03557119,
       0.03732689, 0.03916572, 0.04109128, 0.04310725, 0.04521747,
       0.04742587, 0.04973651, 0.05215356, 0.05468132, 0.05732418,
       0.06008665, 0.06297336, 0.06598901, 0.06913842, 0.07242649,
       0.07585818, 0.07943855, 0.0831727 , 0.08706577, 0.09112296,
       0.09534946, 0.09975049, 0.10433122, 0.10909682, 0.11405238,
       0.11920292, 0.12455336, 0.13010847, 0.1358729 , 0.14185106,
       0.1480472 , 0.15446527, 0.16110895, 0.16798161, 0.17508627,
       0.18242552, 0.19000157, 0.19781611, 0.20587037, 0.21416502,
       0.22270014, 0.23147522, 0.24048908, 0.24973989, 0.2592251 ,
       0.26894142, 0.27888482, 0.2890505 , 0.29943286, 0.31002552,
       0.3208213 , 0.33181223, 0.34298954, 0.35434369, 0.36586441,
       0.37754067, 0.38936077, 0.40131234, 0.41338242, 0.42555748,
       0.4378235 , 0.450166  , 0.46257015, 0.47502081, 0.4875026 ,
       0.5       , 0.5124974 , 0.52497919, 0.53742985, 0.549834  ,
       0.5621765 , 0.57444252, 0.58661758, 0.59868766, 0.61063923,
       0.62245933, 0.63413559, 0.64565631, 0.65701046, 0.66818777,
       0.6791787 , 0.68997448, 0.70056714, 0.7109495 , 0.72111518,
       0.73105858, 0.7407749 , 0.75026011, 0.75951092, 0.76852478,
       0.77729986, 0.78583498, 0.79412963, 0.80218389, 0.80999843,
       0.81757448, 0.82491373, 0.83201839, 0.83889105, 0.84553473,
       0.8519528 , 0.85814894, 0.8641271 , 0.86989153, 0.87544664,
       0.88079708, 0.88594762, 0.89090318, 0.89566878, 0.90024951,
       0.90465054, 0.90887704, 0.91293423, 0.9168273 , 0.92056145,
       0.92414182, 0.92757351, 0.93086158, 0.93401099, 0.93702664,
       0.93991335, 0.94267582, 0.94531868, 0.94784644, 0.95026349,
       0.95257413, 0.95478253, 0.95689275, 0.95890872, 0.96083428,
       0.96267311, 0.96442881, 0.96610484, 0.96770454, 0.96923114,
       0.97068777, 0.97207743, 0.97340301, 0.9746673 , 0.97587298,
       0.97702263, 0.97811873, 0.97916366, 0.98015969, 0.98110904,
       0.98201379, 0.98287597, 0.9836975 , 0.98448024, 0.98522597,
       0.98593637, 0.98661308, 0.98725765, 0.98787157, 0.98845625,
       0.98901306, 0.98954329, 0.9900482 , 0.99052896, 0.9909867 ,
       0.99142251, 0.99183743, 0.99223243, 0.99260846, 0.99296641,
       0.99330715, 0.99363148, 0.9939402 , 0.99423403, 0.9945137 ,
       0.99477987, 0.9950332 , 0.99527429, 0.99550373, 0.99572207,
       0.99592986, 0.9961276 , 0.99631576, 0.99649481, 0.99666519,
       0.99682732, 0.99698158, 0.99712837, 0.99726804, 0.99740093]), array([0.00246651, 0.00259231, 0.0027245 , 0.00286338, 0.00300931,
       0.00316262, 0.00332369, 0.0034929 , 0.00367067, 0.00385741,
       0.00405357, 0.00425962, 0.00447606, 0.00470338, 0.00494213,
       0.00519288, 0.0054562 , 0.00573272, 0.00602308, 0.00632796,
       0.00664806, 0.00698412, 0.00733691, 0.00770723, 0.00809594,
       0.00850391, 0.00893206, 0.00938134, 0.00985276, 0.01034736,
       0.01086623, 0.01141049, 0.01198134, 0.01257998, 0.01320771,
       0.01386584, 0.01455576, 0.01527889, 0.01603673, 0.0168308 ,
       0.01766271, 0.01853409, 0.01944667, 0.02040219, 0.02140248,
       0.02244941, 0.02354491, 0.02469096, 0.02588959, 0.0271429 ,
       0.02845302, 0.02982214, 0.03125247, 0.03274628, 0.03430588,
       0.03593359, 0.03763177, 0.03940279, 0.04124902, 0.04317285,
       0.04517666, 0.04726279, 0.04943357, 0.05169127, 0.05403811,
       0.05647624, 0.05900771, 0.06163446, 0.0643583 , 0.06718089,
       0.07010372, 0.07312807, 0.076255  , 0.07948532, 0.08281957,
       0.08625794, 0.08980033, 0.09344622, 0.0971947 , 0.10104444,
       0.10499359, 0.10903982, 0.11318026, 0.11741145, 0.12172934,
       0.12612923, 0.13060575, 0.13515286, 0.13976379, 0.14443107,
       0.14914645, 0.15390097, 0.1586849 , 0.16348776, 0.16829836,
       0.17310479, 0.17789444, 0.18265408, 0.18736988, 0.19202745,
       0.19661193, 0.20110808, 0.20550031, 0.20977282, 0.2139097 ,
       0.21789499, 0.22171287, 0.22534771, 0.22878424, 0.23200764,
       0.23500371, 0.23775896, 0.24026075, 0.2424974 , 0.24445831,
       0.24613408, 0.24751657, 0.24859901, 0.24937604, 0.24984382,
       0.25      , 0.24984382, 0.24937604, 0.24859901, 0.24751657,
       0.24613408, 0.24445831, 0.2424974 , 0.24026075, 0.23775896,
       0.23500371, 0.23200764, 0.22878424, 0.22534771, 0.22171287,
       0.21789499, 0.2139097 , 0.20977282, 0.20550031, 0.20110808,
       0.19661193, 0.19202745, 0.18736988, 0.18265408, 0.17789444,
       0.17310479, 0.16829836, 0.16348776, 0.1586849 , 0.15390097,
       0.14914645, 0.14443107, 0.13976379, 0.13515286, 0.13060575,
       0.12612923, 0.12172934, 0.11741145, 0.11318026, 0.10903982,
       0.10499359, 0.10104444, 0.0971947 , 0.09344622, 0.08980033,
       0.08625794, 0.08281957, 0.07948532, 0.076255  , 0.07312807,
       0.07010372, 0.06718089, 0.0643583 , 0.06163446, 0.05900771,
       0.05647624, 0.05403811, 0.05169127, 0.04943357, 0.04726279,
       0.04517666, 0.04317285, 0.04124902, 0.03940279, 0.03763177,
       0.03593359, 0.03430588, 0.03274628, 0.03125247, 0.02982214,
       0.02845302, 0.0271429 , 0.02588959, 0.02469096, 0.02354491,
       0.02244941, 0.02140248, 0.02040219, 0.01944667, 0.01853409,
       0.01766271, 0.0168308 , 0.01603673, 0.01527889, 0.01455576,
       0.01386584, 0.01320771, 0.01257998, 0.01198134, 0.01141049,
       0.01086623, 0.01034736, 0.00985276, 0.00938134, 0.00893206,
       0.00850391, 0.00809594, 0.00770723, 0.00733691, 0.00698412,
       0.00664806, 0.00632796, 0.00602308, 0.00573272, 0.0054562 ,
       0.00519288, 0.00494213, 0.00470338, 0.00447606, 0.00425962,
       0.00405357, 0.00385741, 0.00367067, 0.0034929 , 0.00332369,
       0.00316262, 0.00300931, 0.00286338, 0.0027245 , 0.00259231]), array([0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  , 0.  ,
       0.  , 0.05, 0.1 , 0.15, 0.2 , 0.25, 0.3 , 0.35, 0.4 , 0.45, 0.5 ,
       0.55, 0.6 , 0.65, 0.7 , 0.75, 0.8 , 0.85, 0.9 , 0.95, 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ,
       1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  , 1.  ]), array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0.]), array([-1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -1.00000000e+00, -1.00000000e+00, -1.00000000e+00,
       -1.00000000e+00, -9.50000000e-01, -9.00000000e-01, -8.50000000e-01,
       -8.00000000e-01, -7.50000000e-01, -7.00000000e-01, -6.50000000e-01,
       -6.00000000e-01, -5.50000000e-01, -5.00000000e-01, -4.50000000e-01,
       -4.00000000e-01, -3.50000000e-01, -3.00000000e-01, -2.50000000e-01,
       -2.00000000e-01, -1.50000000e-01, -1.00000000e-01, -5.00000000e-02,
       -2.13162821e-14,  5.00000000e-02,  1.00000000e-01,  1.50000000e-01,
        2.00000000e-01,  2.50000000e-01,  3.00000000e-01,  3.50000000e-01,
        4.00000000e-01,  4.50000000e-01,  5.00000000e-01,  5.50000000e-01,
        6.00000000e-01,  6.50000000e-01,  7.00000000e-01,  7.50000000e-01,
        8.00000000e-01,  8.50000000e-01,  9.00000000e-01,  9.50000000e-01,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00,
        1.00000000e+00,  1.00000000e+00,  1.00000000e+00,  1.00000000e+00]), array([0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.,
       0., ... (remaining zeros omitted) ..., 0.]), array([-9.99987712e-01, ... (240-point tanh curve omitted) ..., 9.99986419e-01]), array([2.45765474e-05, ... (derivative of tanh, rising to 1.00000000e+00 at the midpoint, omitted) ..., 2.71612504e-05])]
(240,) (240,)
/opt/conda/lib/python3.6/site-packages/matplotlib/figure.py:448: UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.
  % get_backend())
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter02# vi 6-display-activation-functions.py 

Added the following 4 lines:
import matplotlib as mpl
mpl.use('Agg')
fig = plt.figure()
fig.savefig('img.png')
and commented out the single plt.show() line.
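Put together, the headless-plotting fix looks like this (a minimal sketch using a stand-in tanh curve rather than the book's full script):

```python
# Minimal sketch of the headless-matplotlib fix applied in the Docker
# container; the tanh curve is a stand-in for the book's plots.
import matplotlib
matplotlib.use('Agg')          # select the non-GUI backend before importing pyplot
import matplotlib.pyplot as plt
import numpy as np

x = np.linspace(-5, 5, 240)
fig = plt.figure()
plt.plot(x, np.tanh(x), label='tanh')
plt.legend()
fig.savefig('img.png')         # write the figure to a file instead of plt.show()
```

The saved img.png can then be copied out of the container with `docker cp` and viewed in a browser, as described in the file-sharing section above.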

1-train-CBOW.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter03# python 1-train-CBOW.py 
Traceback (most recent call last):
  File "1-train-CBOW.py", line 165, in <module>
    words = get_words(args.data_file)
  File "1-train-CBOW.py", line 38, in get_words
    with open(fname) as fin:
FileNotFoundError: [Errno 2] No such file or directory: '/sharedfiles/text8'
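Chapter03's 1-train-CBOW.py expects the text8 corpus at /sharedfiles/text8. A hedged sketch of fetching it (assumption: text8 is still distributed as text8.zip from the well-known mattmahoney.net mirror):

```python
# Hedged sketch: download the text8 corpus that 1-train-CBOW.py expects
# at /sharedfiles/text8. Assumes the conventional mirror is still live.
import os
import urllib.request
import zipfile

TEXT8_URL = "http://mattmahoney.net/dc/text8.zip"

def fetch_text8(data_dir="/sharedfiles"):
    """Download and unzip text8 into data_dir unless it is already present."""
    target = os.path.join(data_dir, "text8")
    if not os.path.exists(target):
        os.makedirs(data_dir, exist_ok=True)
        archive, _ = urllib.request.urlretrieve(TEXT8_URL)  # temp file
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(data_dir)
    return target
```

Calling fetch_text8() inside the container should let 1-train-CBOW.py find its input; the same FileNotFoundError pattern applies to the missing mnist.pkl.gz seen earlier.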

2-plot.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter03# python 2-plot.py 
Traceback (most recent call last):
  File "2-plot.py", line 8, in <module>
    with open('idx2word.pkl', 'rb') as f:
FileNotFoundError: [Errno 2] No such file or directory: 'idx2word.pkl'

plot.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter04# python plot.py 
Traceback (most recent call last):
  File "plot.py", line 17, in <module>
    data = numpy.load("train_loss_word_" + typ + "_h" + str(h) + "_e30.npy")
  File "/opt/conda/lib/python3.6/site-packages/numpy/lib/npyio.py", line 384, in load
    fid = open(file, "rb")
FileNotFoundError: [Errno 2] No such file or directory: 'train_loss_word_simple_h500_e30.npy'

predict.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter04# python predict.py 
Traceback (most recent call last):
  File "predict.py", line 6, in <module>
    import models
  File "/Deep-Learning-with-Theano/Chapter04/models/__init__.py", line 1, in <module>
    import simple
ModuleNotFoundError: No module named 'simple'
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter04# pip install simple
Collecting simple
  Could not find a version that satisfies the requirement simple (from versions: )
No matching distribution found for simple
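`simple` is not a PyPI package, so `pip install simple` cannot help: it is Chapter04's models/simple.py, and the error comes from models/__init__.py using the Python-2 implicit relative import `import simple`. A sketch of the Python-3 fix, demonstrated on a throwaway package:

```python
# "import simple" is a Python-2 implicit relative import; under Python 3
# it must be written "from . import simple". Demonstrated on a throwaway
# package built in a temp directory.
import importlib
import os
import sys
import tempfile

pkg_root = tempfile.mkdtemp()
pkg = os.path.join(pkg_root, "models")
os.makedirs(pkg)
with open(os.path.join(pkg, "simple.py"), "w") as f:
    f.write("NAME = 'simple model'\n")
# Python-3 explicit relative import, replacing the bare "import simple"
with open(os.path.join(pkg, "__init__.py"), "w") as f:
    f.write("from . import simple\n")

sys.path.insert(0, pkg_root)
models = importlib.import_module("models")
print(models.simple.NAME)  # prints: simple model
```

Making the same one-line change in the repository's models/__init__.py (and likewise for its sibling modules) should resolve the errors seen in both predict.py and train.py.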

train.py

(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter04# python train.py 
Traceback (most recent call last):
  File "train.py", line 8, in <module>
    import models
  File "/Deep-Learning-with-Theano/Chapter04/models/__init__.py", line 1, in <module>
    import simple
ModuleNotFoundError: No module named 'simple'
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter04# cd ../Chapter05
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter05# ls
README.md  bilstm.py  download_tweets.py  sem_eval2103.dev  sem_eval2103.test  sem_eval2103.train
(base) root@a221771835f7:/Deep-Learning-with-Theano/Chapter05# python bilstm.py 
Using TensorFlow backend.
Train size:  (5605, 43)
Dev size:  (598, 43)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
embedding_1 (Embedding)      (None, 43, 100)           987600    
_________________________________________________________________
bidirectional_1 (Bidirection (None, 128)               84480     
_________________________________________________________________
dense_1 (Dense)              (None, 3)                 387       
_________________________________________________________________
activation_1 (Activation)    (None, 3)                 0         
=================================================================
Total params: 1,072,467
Trainable params: 1,072,467
Non-trainable params: 0
_________________________________________________________________
None
Train on 5605 samples, validate on 598 samples
Epoch 1/30
2018-10-23 11:01:15.256878: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2 AVX AVX2 FMA
2018-10-23 11:01:15.257313: I tensorflow/core/common_runtime/process_util.cc:69] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
5605/5605 [==============================] - 64s 11ms/step - loss: 0.7264 - acc: 0.6806 - val_loss: 0.6455 - val_acc: 0.7090
Epoch 2/30
5605/5605 [==============================] - 56s 10ms/step - loss: 0.5864 - acc: 0.7682 - val_loss: 0.6417 - val_acc: 0.7057
Epoch 3/30
5605/5605 [==============================] - 62s 11ms/step - loss: 0.5210 - acc: 0.7995 - val_loss: 0.7114 - val_acc: 0.7241
Epoch 4/30
5605/5605 [==============================] - 62s 11ms/step - loss: 0.4776 - acc: 0.8182 - val_loss: 0.6850 - val_acc: 0.7074
Epoch 5/30
5605/5605 [==============================] - 60s 11ms/step - loss: 0.4406 - acc: 0.8316 - val_loss: 0.6630 - val_acc: 0.7140
Epoch 6/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.4079 - acc: 0.8409 - val_loss: 0.6768 - val_acc: 0.6806
Epoch 7/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.3775 - acc: 0.8505 - val_loss: 0.6932 - val_acc: 0.6839
Epoch 8/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.3515 - acc: 0.8608 - val_loss: 0.7348 - val_acc: 0.6789
Epoch 9/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.3342 - acc: 0.8710 - val_loss: 0.7590 - val_acc: 0.6739
Epoch 10/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.3197 - acc: 0.8721 - val_loss: 0.7194 - val_acc: 0.6656
Epoch 11/30
5605/5605 [==============================] - 60s 11ms/step - loss: 0.3005 - acc: 0.8840 - val_loss: 0.7525 - val_acc: 0.6438
Epoch 12/30
5605/5605 [==============================] - 61s 11ms/step - loss: 0.2944 - acc: 0.8869 - val_loss: 0.7490 - val_acc: 0.6639
Epoch 13/30
5605/5605 [==============================] - 53s 9ms/step - loss: 0.2863 - acc: 0.8864 - val_loss: 0.8113 - val_acc: 0.6120
Epoch 14/30
5605/5605 [==============================] - 66s 12ms/step - loss: 0.2755 - acc: 0.8899 - val_loss: 0.7793 - val_acc: 0.6137
Epoch 15/30
5605/5605 [==============================] - 64s 11ms/step - loss: 0.2695 - acc: 0.8928 - val_loss: 0.8315 - val_acc: 0.5819
Epoch 16/30
5605/5605 [==============================] - 80s 14ms/step - loss: 0.2619 - acc: 0.8926 - val_loss: 0.8053 - val_acc: 0.6472
Epoch 17/30
5605/5605 [==============================] - 63s 11ms/step - loss: 0.2590 - acc: 0.8944 - val_loss: 0.9162 - val_acc: 0.5786
Epoch 18/30
5605/5605 [==============================] - 54s 10ms/step - loss: 0.2493 - acc: 0.8951 - val_loss: 0.8483 - val_acc: 0.5803
Epoch 19/30
5605/5605 [==============================] - 60s 11ms/step - loss: 0.2438 - acc: 0.8969 - val_loss: 0.9386 - val_acc: 0.5485
Epoch 20/30
5605/5605 [==============================] - 68s 12ms/step - loss: 0.2394 - acc: 0.8999 - val_loss: 0.8595 - val_acc: 0.6020
Epoch 21/30
5605/5605 [==============================] - 57s 10ms/step - loss: 0.2356 - acc: 0.8972 - val_loss: 1.0282 - val_acc: 0.5301
Epoch 22/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2334 - acc: 0.8988 - val_loss: 1.0866 - val_acc: 0.5151
Epoch 23/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2304 - acc: 0.9013 - val_loss: 0.9207 - val_acc: 0.5485
Epoch 24/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2257 - acc: 0.8981 - val_loss: 0.9469 - val_acc: 0.5418
Epoch 25/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2231 - acc: 0.9008 - val_loss: 0.9997 - val_acc: 0.5050
Epoch 26/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2191 - acc: 0.9004 - val_loss: 0.9087 - val_acc: 0.5050
Epoch 27/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2144 - acc: 0.9051 - val_loss: 0.9829 - val_acc: 0.5602
Epoch 28/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2107 - acc: 0.9017 - val_loss: 0.9858 - val_acc: 0.4883
Epoch 29/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2125 - acc: 0.9012 - val_loss: 0.9735 - val_acc: 0.5184
Epoch 30/30
5605/5605 [==============================] - 52s 9ms/step - loss: 0.2108 - acc: 0.9028 - val_loss: 1.0839 - val_acc: 0.5234
Test size:  (2588, 43)
2588/2588 [==============================] - 2s 695us/step
Testing loss: 1.2221; Testing Accuracy: 45.13%
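Incidentally, the log shows val_loss bottoming out around epoch 2 while training accuracy keeps climbing, i.e., the network overfits. A minimal sketch of the early-stopping bookkeeping that would cut training short (plain Python, not the Keras EarlyStopping callback itself; the val_loss values are copied from the log above):

```python
def early_stop_epoch(val_losses, patience=3):
    """Return the 1-based epoch to stop at: the first epoch after which
    val_loss has failed to improve for `patience` consecutive epochs."""
    best = float("inf")
    best_epoch = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best:
            best, best_epoch = loss, epoch
        elif epoch - best_epoch >= patience:
            return epoch
    return len(val_losses)

# first eight val_loss values from the training log above
log = [0.6455, 0.6417, 0.7114, 0.6850, 0.6630, 0.6768, 0.6932, 0.7348]
print(early_stop_epoch(log))  # prints 5: val_loss last improved at epoch 2
```

Keras provides the same logic as `keras.callbacks.EarlyStopping(monitor='val_loss', patience=...)`, which would likely improve the 45.13% test accuracy reported here.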

#2. For those who want to build the docker image themselves

From here on, I record the policy and the steps I used to build the docker image you pulled above.
It is reference material for working with that image; none of it is needed to continue with the book.
These are the steps for building docker/anaconda by hand.
It is not a Dockerfile-based method. Sorry.

docker

Docker lets you use Linux distributions such as Ubuntu and Debian in the same way from Linux, Windows, and macOS.
A key advantage is that you can use it without changing the configuration of the host OS.
A large number of people can work with an identical environment.
Both images officially provided by software vendors and images customized by users are available. Here we take an officially distributed image, customize it ourselves, and make it usable by others.

python

I have been doing the Deep Learning exercises in Python.
The reasons for using Python are that many machine-learning frameworks are available in Python, and that statistical tools such as R can also be used easily from Python.

anaconda

Python has two major versions, 2 and 3, as well as different distribution channels.
I have been using Python 3 via Anaconda for the past year and a half.
I chose Anaconda because the statistical libraries and Jupyter Notebook are included from the start.

Official docker images

There are official distributions of OSes such as Ubuntu and Debian, and of language stacks such as gcc and anaconda.
By using these and registering the result on Docker Hub, you can confirm the quality of the official distribution and share a wide range of information, including the right to modify it. "Official" here means official from each software provider, not something distributed by Docker itself.

docker pull

You use an official docker distribution by pulling it from its URL.

docker Anaconda

We use the image officially distributed by Anaconda.

$  docker pull kaizenjapan/anaconda-keras
Using default tag: latest
latest: Pulling from continuumio/anaconda3
Digest: sha256:e07b9ca98ac1eeb1179dbf0e0bbcebd87701f8654878d6d8ce164d71746964d1
Status: Image is up to date for continuumio/anaconda3:latest

$ docker run -it -p 8888:8888 continuumio/anaconda3 /bin/bash

In practice, I pulled another image I had previously pushed that already had keras and tensorflow installed.

apt

(base) root@d8857ae56e69:/# apt update; apt -y upgrade

(base) root@d8857ae56e69:/# apt-get install -y procps vim apt-utils sudo

Source code (git)

(base) root@f19e2f06eabb:/# git clone https://github.com/PacktPublishing/Deep-Learning-with-Theano

conda

# conda update --prefix /opt/conda anaconda
Solving environment: done

# conda install theano

pip

(base) root@f19e2f06eabb:/d# pip install --upgrade pip
Collecting pip
  Downloading https://files.pythonhosted.org/packages/5f/25/e52d3f31441505a5f3af41213346e5b6c221c9e086a166f3703d2ddaf940/pip-18.0-py2.py3-none-any.whl (1.3MB)
    100% |████████████████████████████████| 1.3MB 2.0MB/s 
distributed 1.21.8 requires msgpack, which is not installed.
Installing collected packages: pip
  Found existing installation: pip 10.0.1
    Uninstalling pip-10.0.1:
      Successfully uninstalled pip-10.0.1
Successfully installed pip-18.0

Registering on docker hub

$ docker ps
CONTAINER ID        IMAGE                   COMMAND                  CREATED             STATUS              PORTS                    NAMES
caef766a99ff        continuumio/anaconda3   "/usr/bin/tini -- /b…"   10 hours ago        Up 10 hours         0.0.0.0:8888->8888/tcp   sleepy_bassi

$ docker commit caef766a99ff kaizenjapan/anaconda-christopher

$ docker push kaizenjapan/anaconda-christopher 

References

なぜdockerで機械学習するか 書籍・ソース一覧作成中 (目標100)
https://qiita.com/kaizen_nagoya/items/ddd12477544bf5ba85e2

dockerで機械学習(1) with anaconda(1)「ゼロから作るDeep Learning - Pythonで学ぶディープラーニングの理論と実装」斎藤 康毅 著
https://qiita.com/kaizen_nagoya/items/a7e94ef6dca128d035ab

dockerで機械学習(2)with anaconda(2)「ゼロから作るDeep Learning2自然言語処理編」斎藤 康毅 著
https://qiita.com/kaizen_nagoya/items/3b80dfc76933cea522c6

dockerで機械学習(3)with anaconda(3)「直感Deep Learning」Antonio Gulli、Sujit Pal 第1章,第2章
https://qiita.com/kaizen_nagoya/items/483ae708c71c88419c32

dockerで機械学習(71) 環境構築(1) docker どっかーら、どーやってもエラーばっかり。
https://qiita.com/kaizen_nagoya/items/690d806a4760d9b9e040

dockerで機械学習(72) 環境構築(2) Docker for Windows
https://qiita.com/kaizen_nagoya/items/c4daa5cf52e9f0c2c002

dockerで機械学習(73) 環境構築(3) docker/linux/macos bash スクリプト, ms-dos batchファイル
https://qiita.com/kaizen_nagoya/items/3f7b39110b7f303a5558

dockerで機械学習(74) 環境構築(4) R 難関いくつ?
https://qiita.com/kaizen_nagoya/items/5fb44773bc38574bcf1c

dockerで機械学習(75)環境構築(5)docker関連ファイルの管理
https://qiita.com/kaizen_nagoya/items/4f03df9a42c923087b5d

OpenCVをPythonで動かそうとしてlibGL.soが無いって言われたけど解決した。
https://qiita.com/toshitanian/items/5da24c0c0bd473d514c8

サーバサイドにおけるmatplotlibによる作図Tips
https://qiita.com/TomokIshii/items/3a26ee4453f535a69e9e

Dockerでホストとコンテナ間でのファイルコピー
https://qiita.com/gologo13/items/7e4e404af80377b48fd5

Docker for Mac でファイル共有を利用する
https://qiita.com/seijimomoto/items/1992d68de8baa7e29bb5

「名古屋のIoTは名古屋のOSで」Dockerをどっかーらどうやって使えばいいんでしょう。TOPPERS/FMP on RaspberryPi with Macintosh編 5つの関門
https://qiita.com/kaizen_nagoya/items/9c46c6da8ceb64d2d7af

64bitCPUへの道 and/or 64歳の決意
https://qiita.com/kaizen_nagoya/items/cfb5ffa24ded23ab3f60

ゼロから作るDeepLearning2自然言語処理編 読書会の進め方(例)
https://qiita.com/kaizen_nagoya/items/025eb3f701b36209302e

Ubuntu 16.04 LTS で NVIDIA Docker を使ってみる
https://blog.amedama.jp/entry/2017/04/03/235901

Document history

ver. 0.10 first draft 2018-10-23

Thank you very much for reading to the last sentence.

Please press the like icon 💚 and follow me for your happy life.
