
# Machine-learning books and source code with Docker: verification in progress on M2 macOS

Posted at 2023-09-30

Why do machine learning with Docker: a list of books and source code, in progress (goal: 100)
https://qiita.com/kaizen_nagoya/items/ddd12477544bf5ba85e2

This article re-verifies that the Docker images still run, records any new issues found, and fixes typos in the earlier articles.
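All three images in the table below were published for linux/amd64 only, so on an M2 (arm64) host Docker runs them under emulation. A minimal way to check what the host and the daemon report, assuming Docker Desktop is running:

```bash
# Host CPU architecture (arm64 on Apple silicon)
uname -m
# Architecture the Docker daemon reports
docker version --format '{{.Server.Arch}}'
```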

| # | Title | Qiita URL | Qiita views | Qiita likes | Qiita stocks | Docker Hub | Hub stars | Hub pulls |
|---|-------|-----------|------------:|------------:|-------------:|------------|----------:|----------:|
| 1 | 「ゼロから作るDeep Learning - Pythonで学ぶディープラーニングの理論と実装」斎藤 康毅 著。dockerで機械学習 with anaconda(1) | https://qiita.com/kaizen_nagoya/items/a7e94ef6dca128d035ab | 5969 | 4 | 15 | https://hub.docker.com/repository/docker/kaizenjapan/anaconda-deep-1/ | 0 | 141 |
| 2 | dockerで機械学習(2)with anaconda(2)「ゼロから作るDeep Learning2自然言語処理編」斎藤 康毅 著 | https://qiita.com/kaizen_nagoya/items/3b80dfc76933cea522c6 | 5402 | 10 | 12 | https://hub.docker.com/r/kaizenjapan/anaconda-deep | 0 | 95 |
| 3 | 「直感Deep Learning」Antonio Gulli, Sujit Pal著 dockerで機械学習(3) with anaconda(3) | https://qiita.com/kaizen_nagoya/items/483ae708c71c88419c32 | 37478 | 42 | 56 | https://hub.docker.com/r/kaizenjapan/anaconda-keras | 0 | 127 |

## 1

```bash
$ docker run -it kaizenjapan/anaconda-deep-1 /bin/bash
Unable to find image 'kaizenjapan/anaconda-deep-1:latest' locally
latest: Pulling from kaizenjapan/anaconda-deep-1
cc1a78bfd46b: Pull complete 
314b82d3c9fe: Pull complete 
adebea299011: Pull complete 
f7baff790e81: Pull complete 
efa06e5f16f5: Pull complete 
1b7e359a3cb6: Pull complete 
Digest: sha256:32902dcafb18864aee025b82f472a75065223b32903aaf0ee4f49781a66b534b
Status: Downloaded newer image for kaizenjapan/anaconda-deep-1:latest
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
```
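The WARNING appears because the image was built for linux/amd64 while this host is linux/arm64/v8, so Docker Desktop runs it under emulation (QEMU or Rosetta, depending on settings). Requesting the platform explicitly makes the choice deliberate and silences the warning; a minimal sketch:

```bash
# Explicitly ask for the amd64 variant (it still runs emulated on Apple
# silicon, but the platform-mismatch warning goes away).
docker run --platform linux/amd64 -it kaizenjapan/anaconda-deep-1 /bin/bash
```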

Did it actually start?

```bash
# cd deep-learning-from-scratch/
# cd ch01
# python man.py
Initilized!
Hello David!
Good-bye David!
```

Great, it works.

## 2

```bash
$ docker run -it kaizenjapan/anaconda-deep /bin/bash
Unable to find image 'kaizenjapan/anaconda-deep:latest' locally
latest: Pulling from kaizenjapan/anaconda-deep
cc1a78bfd46b: Already exists 
314b82d3c9fe: Already exists 
adebea299011: Already exists 
f7baff790e81: Already exists 
14177dba66b0: Pull complete 
6221d8fe039f: Pull complete 
Digest: sha256:f2ad7a23b438dbefed691fcd56063509168229cf3e4e475b0a2fb5f7bde437b4
Status: Downloaded newer image for kaizenjapan/anaconda-deep:latest
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
```

What is pleasing here is that the first four layers already exist locally and do not need to be downloaded again. Shared image layers are a blessing (a way to confirm the reuse is sketched just below). Now start the container.
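A minimal sketch, assuming both images have already been pulled: comparing the layer digests of the two images shows the overlap that produced the `Already exists` lines.

```bash
# Print the layer digests of each image; the first four entries should
# match, which is exactly why the pull reported "Already exists" for them.
docker image inspect --format '{{json .RootFS.Layers}}' kaizenjapan/anaconda-deep-1
docker image inspect --format '{{json .RootFS.Layers}}' kaizenjapan/anaconda-deep
```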

```bash
(base) root@e2a819a4c7bb:/# cd deep-learning-from-scratch-2/
(base) root@e2a819a4c7bb:/deep-learning-from-scratch-2# cd ch01
(base) root@e2a819a4c7bb:/deep-learning-from-scratch-2/ch01# python forward_net.py 
[[ 0.08721473 -0.38508087  0.26882016]
 [ 0.07294879 -0.46098306  0.41678154]
 [ 0.05475137 -0.38658928  0.27231017]
 [ 0.00322351 -0.42973594  0.27612328]
 [-0.61642203 -0.67760151  0.26483961]
 [-0.06321613 -0.45013971  0.28064339]
 [-0.24093733 -0.55098392  0.27910076]
 [-0.67991937 -0.7121916   0.25886429]
 [ 0.00399074 -0.42747284  0.29108553]
 [ 0.08926864 -0.44693342  0.38820465]]
```

Interesting: the printed values differ from my earlier run, presumably because forward_net.py draws its inputs and weights at random on each run. The exact reason is still under investigation.
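If random initialization is indeed the cause, fixing NumPy's seed should make repeated runs print identical numbers. A minimal check (hypothetical; this assumes the script's randomness comes entirely from NumPy):

```bash
python - <<'EOF'
import numpy as np

np.random.seed(0)            # fix the seed so every run is reproducible
x = np.random.randn(10, 2)   # the same kind of random input forward_net.py draws
print(x[:2])                 # prints the same two rows on every run
EOF
```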

## 3

As before, the nice part is that the first four layers already exist and need no download, which confirms that the earlier work was worthwhile. Shared layers are a blessing. Start it up.

```bash
$ docker run -it -p 8888:8888 -p 6006:6006 kaizenjapan/anaconda-keras /bin/bash
Unable to find image 'kaizenjapan/anaconda-keras:latest' locally
latest: Pulling from kaizenjapan/anaconda-keras
cc1a78bfd46b: Already exists 
314b82d3c9fe: Already exists 
adebea299011: Already exists 
f7baff790e81: Already exists 
44c478e462a0: Pull complete 
d058cda2eda9: Pull complete 
Digest: sha256:963787c61d17778b0dc3aaf2eec4be801d0f2825f066c0cbba7074fec29c58e6
Status: Downloaded newer image for kaizenjapan/anaconda-keras:latest
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
(base) root@c3f856cc5cd8:/# cd deep-learning-with-keras-ja/
(base) root@c3f856cc5cd8:/deep-learning-with-keras-ja# python keras_MINST_V1.py 
python: can't open file 'keras_MINST_V1.py': [Errno 2] No such file or directory
(base) root@c3f856cc5cd8:/deep-learning-with-keras-ja# ls
README.md  ch02  ch04  ch06  ch08
ch01	   ch03  ch05  ch07  deep-learning-with-keras-ja.png
(base) root@c3f856cc5cd8:/deep-learning-with-keras-ja# cd ch01
(base) root@c3f856cc5cd8:/deep-learning-with-keras-ja/ch01# python keras_MINST_V1.py 
/opt/conda/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.
60000 train samples
10000 test samples
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 10)                7850      
_________________________________________________________________
activation_1 (Activation)    (None, 10)                0         
=================================================================
Total params: 7,850
Trainable params: 7,850
Non-trainable params: 0
_________________________________________________________________
Train on 48000 samples, validate on 12000 samples
2023-10-01 00:00:25.605633: I tensorflow/core/platform/cpu_feature_guard.cc:141] Your CPU supports instructions that this TensorFlow binary was not compiled to use: SSE4.1 SSE4.2
2023-10-01 00:00:25.608803: I tensorflow/core/common_runtime/process_util.cc:69] Creating new thread pool with default inter op setting: 2. Tune using inter_op_parallelism_threads for best performance.
Epoch 1/200
48000/48000 [==============================] - 13s 262us/step - loss: 1.3633 - acc: 0.6796 - val_loss: 0.8904 - val_acc: 0.8246
Epoch 2/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.7913 - acc: 0.8272 - val_loss: 0.6572 - val_acc: 0.8546
Epoch 3/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.6436 - acc: 0.8497 - val_loss: 0.5625 - val_acc: 0.8681
Epoch 4/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.5717 - acc: 0.8602 - val_loss: 0.5098 - val_acc: 0.8765
Epoch 5/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.5276 - acc: 0.8678 - val_loss: 0.4758 - val_acc: 0.8826
Epoch 6/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.4973 - acc: 0.8726 - val_loss: 0.4515 - val_acc: 0.8866
Epoch 7/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.4748 - acc: 0.8775 - val_loss: 0.4333 - val_acc: 0.8882
Epoch 8/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.4574 - acc: 0.8803 - val_loss: 0.4189 - val_acc: 0.8920
Epoch 9/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.4433 - acc: 0.8834 - val_loss: 0.4075 - val_acc: 0.8939
Epoch 10/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.4317 - acc: 0.8850 - val_loss: 0.3977 - val_acc: 0.8966
Epoch 11/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.4218 - acc: 0.8873 - val_loss: 0.3896 - val_acc: 0.8984
Epoch 12/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.4134 - acc: 0.8888 - val_loss: 0.3827 - val_acc: 0.8995
Epoch 13/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.4060 - acc: 0.8902 - val_loss: 0.3766 - val_acc: 0.9003
Epoch 14/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3995 - acc: 0.8918 - val_loss: 0.3712 - val_acc: 0.9013
Epoch 15/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3936 - acc: 0.8928 - val_loss: 0.3664 - val_acc: 0.9016
Epoch 16/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3884 - acc: 0.8945 - val_loss: 0.3621 - val_acc: 0.9031
Epoch 17/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3837 - acc: 0.8950 - val_loss: 0.3582 - val_acc: 0.9033
Epoch 18/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3794 - acc: 0.8962 - val_loss: 0.3546 - val_acc: 0.9039
Epoch 19/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3755 - acc: 0.8970 - val_loss: 0.3514 - val_acc: 0.9048
Epoch 20/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3718 - acc: 0.8979 - val_loss: 0.3485 - val_acc: 0.9053
Epoch 21/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3685 - acc: 0.8985 - val_loss: 0.3457 - val_acc: 0.9058
Epoch 22/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3653 - acc: 0.8995 - val_loss: 0.3431 - val_acc: 0.9058
Epoch 23/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3625 - acc: 0.8999 - val_loss: 0.3407 - val_acc: 0.9063
Epoch 24/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3598 - acc: 0.9008 - val_loss: 0.3385 - val_acc: 0.9070
Epoch 25/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3572 - acc: 0.9012 - val_loss: 0.3364 - val_acc: 0.9074
Epoch 26/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3548 - acc: 0.9019 - val_loss: 0.3345 - val_acc: 0.9084
Epoch 27/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3525 - acc: 0.9022 - val_loss: 0.3326 - val_acc: 0.9082
Epoch 28/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3504 - acc: 0.9032 - val_loss: 0.3311 - val_acc: 0.9090
Epoch 29/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3484 - acc: 0.9031 - val_loss: 0.3293 - val_acc: 0.9094
Epoch 30/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3465 - acc: 0.9041 - val_loss: 0.3277 - val_acc: 0.9097
Epoch 31/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3447 - acc: 0.9044 - val_loss: 0.3264 - val_acc: 0.9097
Epoch 32/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.3430 - acc: 0.9047 - val_loss: 0.3249 - val_acc: 0.9097
Epoch 33/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3413 - acc: 0.9051 - val_loss: 0.3235 - val_acc: 0.9103
Epoch 34/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3397 - acc: 0.9056 - val_loss: 0.3222 - val_acc: 0.9104
Epoch 35/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3382 - acc: 0.9058 - val_loss: 0.3211 - val_acc: 0.9110
Epoch 36/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3368 - acc: 0.9062 - val_loss: 0.3198 - val_acc: 0.9110
Epoch 37/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3353 - acc: 0.9069 - val_loss: 0.3187 - val_acc: 0.9117
Epoch 38/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3340 - acc: 0.9075 - val_loss: 0.3177 - val_acc: 0.9120
Epoch 39/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3327 - acc: 0.9075 - val_loss: 0.3166 - val_acc: 0.9122
Epoch 40/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.3314 - acc: 0.9078 - val_loss: 0.3159 - val_acc: 0.9118
Epoch 41/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3303 - acc: 0.9080 - val_loss: 0.3147 - val_acc: 0.9127
Epoch 42/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3291 - acc: 0.9084 - val_loss: 0.3138 - val_acc: 0.9132
Epoch 43/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3280 - acc: 0.9089 - val_loss: 0.3130 - val_acc: 0.9132
Epoch 44/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3270 - acc: 0.9091 - val_loss: 0.3121 - val_acc: 0.9132
Epoch 45/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3259 - acc: 0.9093 - val_loss: 0.3113 - val_acc: 0.9135
Epoch 46/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3249 - acc: 0.9095 - val_loss: 0.3105 - val_acc: 0.9137
Epoch 47/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3239 - acc: 0.9105 - val_loss: 0.3098 - val_acc: 0.9141
Epoch 48/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3230 - acc: 0.9105 - val_loss: 0.3090 - val_acc: 0.9146
Epoch 49/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3221 - acc: 0.9102 - val_loss: 0.3083 - val_acc: 0.9151
Epoch 50/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3212 - acc: 0.9109 - val_loss: 0.3075 - val_acc: 0.9150
Epoch 51/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3204 - acc: 0.9109 - val_loss: 0.3070 - val_acc: 0.9150
Epoch 52/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3195 - acc: 0.9112 - val_loss: 0.3063 - val_acc: 0.9148
Epoch 53/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3187 - acc: 0.9114 - val_loss: 0.3057 - val_acc: 0.9153
Epoch 54/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3180 - acc: 0.9117 - val_loss: 0.3050 - val_acc: 0.9148
Epoch 55/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3171 - acc: 0.9121 - val_loss: 0.3044 - val_acc: 0.9149
Epoch 56/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3164 - acc: 0.9121 - val_loss: 0.3037 - val_acc: 0.9156
Epoch 57/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3157 - acc: 0.9128 - val_loss: 0.3034 - val_acc: 0.9152
Epoch 58/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3149 - acc: 0.9121 - val_loss: 0.3029 - val_acc: 0.9148
Epoch 59/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3143 - acc: 0.9128 - val_loss: 0.3022 - val_acc: 0.9151
Epoch 60/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3136 - acc: 0.9129 - val_loss: 0.3016 - val_acc: 0.9161
Epoch 61/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3130 - acc: 0.9133 - val_loss: 0.3011 - val_acc: 0.9158
Epoch 62/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3123 - acc: 0.9131 - val_loss: 0.3007 - val_acc: 0.9151
Epoch 63/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3117 - acc: 0.9136 - val_loss: 0.3003 - val_acc: 0.9156
Epoch 64/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3110 - acc: 0.9137 - val_loss: 0.2997 - val_acc: 0.9158
Epoch 65/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3105 - acc: 0.9137 - val_loss: 0.2992 - val_acc: 0.9159
Epoch 66/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3098 - acc: 0.9138 - val_loss: 0.2988 - val_acc: 0.9161
Epoch 67/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3093 - acc: 0.9141 - val_loss: 0.2983 - val_acc: 0.9165
Epoch 68/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3087 - acc: 0.9139 - val_loss: 0.2979 - val_acc: 0.9166
Epoch 69/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3082 - acc: 0.9144 - val_loss: 0.2976 - val_acc: 0.9164
Epoch 70/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.3077 - acc: 0.9145 - val_loss: 0.2971 - val_acc: 0.9166
Epoch 71/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3071 - acc: 0.9146 - val_loss: 0.2967 - val_acc: 0.9172
Epoch 72/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3066 - acc: 0.9147 - val_loss: 0.2964 - val_acc: 0.9167
Epoch 73/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3061 - acc: 0.9151 - val_loss: 0.2960 - val_acc: 0.9169
Epoch 74/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.3056 - acc: 0.9150 - val_loss: 0.2956 - val_acc: 0.9173
Epoch 75/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3051 - acc: 0.9151 - val_loss: 0.2952 - val_acc: 0.9177
Epoch 76/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3046 - acc: 0.9152 - val_loss: 0.2950 - val_acc: 0.9173
Epoch 77/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3042 - acc: 0.9154 - val_loss: 0.2945 - val_acc: 0.9172
Epoch 78/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.3037 - acc: 0.9154 - val_loss: 0.2942 - val_acc: 0.9176
Epoch 79/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3032 - acc: 0.9157 - val_loss: 0.2939 - val_acc: 0.9179
Epoch 80/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.3028 - acc: 0.9156 - val_loss: 0.2936 - val_acc: 0.9177
Epoch 81/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3024 - acc: 0.9157 - val_loss: 0.2933 - val_acc: 0.9179
Epoch 82/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.3019 - acc: 0.9157 - val_loss: 0.2930 - val_acc: 0.9178
Epoch 83/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3015 - acc: 0.9160 - val_loss: 0.2926 - val_acc: 0.9182
Epoch 84/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3011 - acc: 0.9161 - val_loss: 0.2924 - val_acc: 0.9179
Epoch 85/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3007 - acc: 0.9165 - val_loss: 0.2920 - val_acc: 0.9184
Epoch 86/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.3003 - acc: 0.9164 - val_loss: 0.2918 - val_acc: 0.9185
Epoch 87/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2999 - acc: 0.9165 - val_loss: 0.2914 - val_acc: 0.9185
Epoch 88/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2995 - acc: 0.9166 - val_loss: 0.2911 - val_acc: 0.9188
Epoch 89/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2991 - acc: 0.9167 - val_loss: 0.2909 - val_acc: 0.9191
Epoch 90/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2988 - acc: 0.9169 - val_loss: 0.2906 - val_acc: 0.9191
Epoch 91/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2984 - acc: 0.9168 - val_loss: 0.2903 - val_acc: 0.9192
Epoch 92/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2981 - acc: 0.9170 - val_loss: 0.2901 - val_acc: 0.9196
Epoch 93/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2977 - acc: 0.9171 - val_loss: 0.2898 - val_acc: 0.9195
Epoch 94/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2973 - acc: 0.9172 - val_loss: 0.2895 - val_acc: 0.9196
Epoch 95/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2970 - acc: 0.9174 - val_loss: 0.2894 - val_acc: 0.9196
Epoch 96/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2967 - acc: 0.9174 - val_loss: 0.2891 - val_acc: 0.9198
Epoch 97/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2963 - acc: 0.9176 - val_loss: 0.2889 - val_acc: 0.9197
Epoch 98/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2960 - acc: 0.9174 - val_loss: 0.2886 - val_acc: 0.9202
Epoch 99/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2957 - acc: 0.9176 - val_loss: 0.2884 - val_acc: 0.9202
Epoch 100/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2953 - acc: 0.9178 - val_loss: 0.2882 - val_acc: 0.9200
Epoch 101/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2950 - acc: 0.9179 - val_loss: 0.2879 - val_acc: 0.9201
Epoch 102/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2947 - acc: 0.9180 - val_loss: 0.2877 - val_acc: 0.9204
Epoch 103/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2944 - acc: 0.9180 - val_loss: 0.2875 - val_acc: 0.9202
Epoch 104/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2941 - acc: 0.9184 - val_loss: 0.2873 - val_acc: 0.9202
Epoch 105/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2938 - acc: 0.9183 - val_loss: 0.2871 - val_acc: 0.9206
Epoch 106/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2935 - acc: 0.9183 - val_loss: 0.2868 - val_acc: 0.9202
Epoch 107/200
48000/48000 [==============================] - 13s 264us/step - loss: 0.2932 - acc: 0.9186 - val_loss: 0.2867 - val_acc: 0.9206
Epoch 108/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2929 - acc: 0.9185 - val_loss: 0.2864 - val_acc: 0.9208
Epoch 109/200
48000/48000 [==============================] - 931s 19ms/step - loss: 0.2927 - acc: 0.9185 - val_loss: 0.2863 - val_acc: 0.9206
Epoch 110/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2923 - acc: 0.9187 - val_loss: 0.2860 - val_acc: 0.9204
Epoch 111/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2921 - acc: 0.9184 - val_loss: 0.2858 - val_acc: 0.9210
Epoch 112/200
48000/48000 [==============================] - 37s 772us/step - loss: 0.2918 - acc: 0.9187 - val_loss: 0.2857 - val_acc: 0.9207
Epoch 113/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2915 - acc: 0.9189 - val_loss: 0.2854 - val_acc: 0.9210
Epoch 114/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2913 - acc: 0.9188 - val_loss: 0.2853 - val_acc: 0.9211
Epoch 115/200
48000/48000 [==============================] - 13s 263us/step - loss: 0.2910 - acc: 0.9189 - val_loss: 0.2852 - val_acc: 0.9205
Epoch 116/200
48000/48000 [==============================] - 117s 2ms/step - loss: 0.2908 - acc: 0.9189 - val_loss: 0.2849 - val_acc: 0.9213
Epoch 117/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2905 - acc: 0.9193 - val_loss: 0.2847 - val_acc: 0.9213
Epoch 118/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2902 - acc: 0.9192 - val_loss: 0.2846 - val_acc: 0.9212
Epoch 119/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2900 - acc: 0.9191 - val_loss: 0.2844 - val_acc: 0.9212
Epoch 120/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2898 - acc: 0.9192 - val_loss: 0.2842 - val_acc: 0.9212
Epoch 121/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2895 - acc: 0.9191 - val_loss: 0.2841 - val_acc: 0.9212
Epoch 122/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2892 - acc: 0.9192 - val_loss: 0.2840 - val_acc: 0.9212
Epoch 123/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2890 - acc: 0.9194 - val_loss: 0.2838 - val_acc: 0.9211
Epoch 124/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2888 - acc: 0.9197 - val_loss: 0.2837 - val_acc: 0.9210
Epoch 125/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2885 - acc: 0.9193 - val_loss: 0.2835 - val_acc: 0.9207
Epoch 126/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2883 - acc: 0.9197 - val_loss: 0.2834 - val_acc: 0.9217
Epoch 127/200
48000/48000 [==============================] - 13s 277us/step - loss: 0.2881 - acc: 0.9194 - val_loss: 0.2832 - val_acc: 0.9212
Epoch 128/200
48000/48000 [==============================] - 13s 281us/step - loss: 0.2879 - acc: 0.9194 - val_loss: 0.2830 - val_acc: 0.9210
Epoch 129/200
48000/48000 [==============================] - 13s 281us/step - loss: 0.2876 - acc: 0.9196 - val_loss: 0.2828 - val_acc: 0.9217
Epoch 130/200
48000/48000 [==============================] - 14s 285us/step - loss: 0.2874 - acc: 0.9197 - val_loss: 0.2826 - val_acc: 0.9216
Epoch 131/200
48000/48000 [==============================] - 14s 284us/step - loss: 0.2871 - acc: 0.9200 - val_loss: 0.2827 - val_acc: 0.9211
Epoch 132/200
48000/48000 [==============================] - 14s 281us/step - loss: 0.2870 - acc: 0.9197 - val_loss: 0.2824 - val_acc: 0.9213
Epoch 133/200
48000/48000 [==============================] - 14s 284us/step - loss: 0.2868 - acc: 0.9198 - val_loss: 0.2823 - val_acc: 0.9216
Epoch 134/200
48000/48000 [==============================] - 14s 282us/step - loss: 0.2866 - acc: 0.9199 - val_loss: 0.2822 - val_acc: 0.9214
Epoch 135/200
48000/48000 [==============================] - 14s 286us/step - loss: 0.2863 - acc: 0.9203 - val_loss: 0.2820 - val_acc: 0.9213
Epoch 136/200
48000/48000 [==============================] - 14s 288us/step - loss: 0.2861 - acc: 0.9196 - val_loss: 0.2818 - val_acc: 0.9215
Epoch 137/200
48000/48000 [==============================] - 14s 285us/step - loss: 0.2859 - acc: 0.9198 - val_loss: 0.2818 - val_acc: 0.9217
Epoch 138/200
48000/48000 [==============================] - 14s 283us/step - loss: 0.2857 - acc: 0.9203 - val_loss: 0.2815 - val_acc: 0.9218
Epoch 139/200
48000/48000 [==============================] - 14s 285us/step - loss: 0.2855 - acc: 0.9203 - val_loss: 0.2814 - val_acc: 0.9215
Epoch 140/200
48000/48000 [==============================] - 14s 299us/step - loss: 0.2853 - acc: 0.9201 - val_loss: 0.2812 - val_acc: 0.9216
Epoch 141/200
48000/48000 [==============================] - 14s 302us/step - loss: 0.2852 - acc: 0.9204 - val_loss: 0.2811 - val_acc: 0.9217
Epoch 142/200
48000/48000 [==============================] - 14s 301us/step - loss: 0.2849 - acc: 0.9201 - val_loss: 0.2810 - val_acc: 0.9217
Epoch 143/200
48000/48000 [==============================] - 15s 303us/step - loss: 0.2848 - acc: 0.9205 - val_loss: 0.2809 - val_acc: 0.9219
Epoch 144/200
48000/48000 [==============================] - 14s 299us/step - loss: 0.2846 - acc: 0.9208 - val_loss: 0.2808 - val_acc: 0.9217
Epoch 145/200
48000/48000 [==============================] - 13s 261us/step - loss: 0.2844 - acc: 0.9207 - val_loss: 0.2806 - val_acc: 0.9221
Epoch 146/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2841 - acc: 0.9206 - val_loss: 0.2806 - val_acc: 0.9220
Epoch 147/200
48000/48000 [==============================] - 12s 260us/step - loss: 0.2840 - acc: 0.9207 - val_loss: 0.2804 - val_acc: 0.9217
Epoch 148/200
48000/48000 [==============================] - 12s 253us/step - loss: 0.2838 - acc: 0.9209 - val_loss: 0.2803 - val_acc: 0.9218
Epoch 149/200
48000/48000 [==============================] - 13s 263us/step - loss: 0.2836 - acc: 0.9208 - val_loss: 0.2802 - val_acc: 0.9216
Epoch 150/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2835 - acc: 0.9210 - val_loss: 0.2800 - val_acc: 0.9225
Epoch 151/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2833 - acc: 0.9210 - val_loss: 0.2799 - val_acc: 0.9226
Epoch 152/200
48000/48000 [==============================] - 14s 284us/step - loss: 0.2831 - acc: 0.9211 - val_loss: 0.2798 - val_acc: 0.9222
Epoch 153/200
48000/48000 [==============================] - 16s 324us/step - loss: 0.2829 - acc: 0.9207 - val_loss: 0.2797 - val_acc: 0.9224
Epoch 154/200
48000/48000 [==============================] - 16s 324us/step - loss: 0.2827 - acc: 0.9209 - val_loss: 0.2796 - val_acc: 0.9222
Epoch 155/200
48000/48000 [==============================] - 15s 314us/step - loss: 0.2826 - acc: 0.9208 - val_loss: 0.2795 - val_acc: 0.9225
Epoch 156/200
48000/48000 [==============================] - 15s 320us/step - loss: 0.2824 - acc: 0.9210 - val_loss: 0.2794 - val_acc: 0.9224
Epoch 157/200
48000/48000 [==============================] - 16s 325us/step - loss: 0.2822 - acc: 0.9210 - val_loss: 0.2793 - val_acc: 0.9224
Epoch 158/200
48000/48000 [==============================] - 16s 326us/step - loss: 0.2821 - acc: 0.9214 - val_loss: 0.2792 - val_acc: 0.9226
Epoch 159/200
48000/48000 [==============================] - 13s 278us/step - loss: 0.2819 - acc: 0.9214 - val_loss: 0.2791 - val_acc: 0.9226
Epoch 160/200
48000/48000 [==============================] - 13s 281us/step - loss: 0.2817 - acc: 0.9213 - val_loss: 0.2790 - val_acc: 0.9225
Epoch 161/200
48000/48000 [==============================] - 14s 283us/step - loss: 0.2816 - acc: 0.9214 - val_loss: 0.2789 - val_acc: 0.9222
Epoch 162/200
48000/48000 [==============================] - 14s 299us/step - loss: 0.2814 - acc: 0.9215 - val_loss: 0.2788 - val_acc: 0.9227
Epoch 163/200
48000/48000 [==============================] - 14s 290us/step - loss: 0.2812 - acc: 0.9213 - val_loss: 0.2787 - val_acc: 0.9225
Epoch 164/200
48000/48000 [==============================] - 15s 304us/step - loss: 0.2811 - acc: 0.9216 - val_loss: 0.2786 - val_acc: 0.9225
Epoch 165/200
48000/48000 [==============================] - 14s 284us/step - loss: 0.2809 - acc: 0.9215 - val_loss: 0.2785 - val_acc: 0.9227
Epoch 166/200
48000/48000 [==============================] - 15s 303us/step - loss: 0.2807 - acc: 0.9216 - val_loss: 0.2784 - val_acc: 0.9225
Epoch 167/200
48000/48000 [==============================] - 13s 273us/step - loss: 0.2806 - acc: 0.9217 - val_loss: 0.2784 - val_acc: 0.9227
Epoch 168/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2804 - acc: 0.9219 - val_loss: 0.2782 - val_acc: 0.9228
Epoch 169/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2803 - acc: 0.9216 - val_loss: 0.2782 - val_acc: 0.9227
Epoch 170/200
48000/48000 [==============================] - 12s 259us/step - loss: 0.2801 - acc: 0.9216 - val_loss: 0.2781 - val_acc: 0.9227
Epoch 171/200
48000/48000 [==============================] - 14s 282us/step - loss: 0.2800 - acc: 0.9220 - val_loss: 0.2780 - val_acc: 0.9226
Epoch 172/200
48000/48000 [==============================] - 13s 269us/step - loss: 0.2798 - acc: 0.9218 - val_loss: 0.2778 - val_acc: 0.9231
Epoch 173/200
48000/48000 [==============================] - 13s 281us/step - loss: 0.2797 - acc: 0.9217 - val_loss: 0.2778 - val_acc: 0.9229
Epoch 174/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2796 - acc: 0.9217 - val_loss: 0.2777 - val_acc: 0.9227
Epoch 175/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2794 - acc: 0.9218 - val_loss: 0.2776 - val_acc: 0.9232
Epoch 176/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2793 - acc: 0.9220 - val_loss: 0.2775 - val_acc: 0.9232
Epoch 177/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2791 - acc: 0.9219 - val_loss: 0.2774 - val_acc: 0.9234
Epoch 178/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2790 - acc: 0.9221 - val_loss: 0.2774 - val_acc: 0.9228
Epoch 179/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2788 - acc: 0.9221 - val_loss: 0.2773 - val_acc: 0.9232
Epoch 180/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2787 - acc: 0.9221 - val_loss: 0.2771 - val_acc: 0.9235
Epoch 181/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2785 - acc: 0.9223 - val_loss: 0.2770 - val_acc: 0.9232
Epoch 182/200
48000/48000 [==============================] - 12s 257us/step - loss: 0.2784 - acc: 0.9220 - val_loss: 0.2769 - val_acc: 0.9231
Epoch 183/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2783 - acc: 0.9223 - val_loss: 0.2769 - val_acc: 0.9231
Epoch 184/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2781 - acc: 0.9223 - val_loss: 0.2768 - val_acc: 0.9230
Epoch 185/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2780 - acc: 0.9224 - val_loss: 0.2767 - val_acc: 0.9233
Epoch 186/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2779 - acc: 0.9223 - val_loss: 0.2766 - val_acc: 0.9236
Epoch 187/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2777 - acc: 0.9224 - val_loss: 0.2766 - val_acc: 0.9233
Epoch 188/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2776 - acc: 0.9226 - val_loss: 0.2765 - val_acc: 0.9236
Epoch 189/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2775 - acc: 0.9225 - val_loss: 0.2764 - val_acc: 0.9235
Epoch 190/200
48000/48000 [==============================] - 12s 255us/step - loss: 0.2773 - acc: 0.9225 - val_loss: 0.2764 - val_acc: 0.9235
Epoch 191/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2772 - acc: 0.9225 - val_loss: 0.2763 - val_acc: 0.9237
Epoch 192/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2770 - acc: 0.9226 - val_loss: 0.2762 - val_acc: 0.9238
Epoch 193/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2770 - acc: 0.9226 - val_loss: 0.2761 - val_acc: 0.9237
Epoch 194/200
48000/48000 [==============================] - 12s 258us/step - loss: 0.2768 - acc: 0.9226 - val_loss: 0.2761 - val_acc: 0.9236
Epoch 195/200
48000/48000 [==============================] - 12s 260us/step - loss: 0.2767 - acc: 0.9231 - val_loss: 0.2760 - val_acc: 0.9239
Epoch 196/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2766 - acc: 0.9226 - val_loss: 0.2758 - val_acc: 0.9241
Epoch 197/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2765 - acc: 0.9229 - val_loss: 0.2758 - val_acc: 0.9242
Epoch 198/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2763 - acc: 0.9231 - val_loss: 0.2758 - val_acc: 0.9236
Epoch 199/200
48000/48000 [==============================] - 12s 254us/step - loss: 0.2762 - acc: 0.9229 - val_loss: 0.2757 - val_acc: 0.9241
Epoch 200/200
48000/48000 [==============================] - 12s 256us/step - loss: 0.2761 - acc: 0.9230 - val_loss: 0.2756 - val_acc: 0.9241
10000/10000 [==============================] - 2s 156us/step

Test score: 0.27738584992289544
Test accuracy: 0.9227
(base) root@c3f856cc5cd8:/deep-learning-with-keras-ja/ch01#
```
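A sanity check on the model summary above: the 7,850 trainable parameters are exactly one Dense layer from 784 pixel inputs to 10 classes, weights plus biases.

```bash
python - <<'EOF'
# 784 inputs x 10 outputs of weights, plus 10 biases,
# matches "Total params: 7,850" in the Keras summary.
print(784 * 10 + 10)  # -> 7850
EOF
```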

## reinstall

Training is slow (some epochs took 931 s or 117 s instead of 12 s). Let me try reinstalling from a fresh image.

```bash
Last login: Sat Sep 30 21:41:45 on ttys000

The default interactive shell is now zsh.
To update your account to use zsh, please run `chsh -s /bin/zsh`.
For more details, please visit https://support.apple.com/kb/HT208050.
MacBook-Air:~ ogawakiyoshi$ docker run -it -p 8888:8888 -p 6006:6006  continuumio/anaconda3 /bin/bash
Unable to find image 'continuumio/anaconda3:latest' locally
latest: Pulling from continuumio/anaconda3
fc521c5b9835: Pull complete 
9d89763995c5: Pull complete 
Digest: sha256:b60631636309ed40a3bc01edc326128aeadfa50622da76052abc9ef2e1d3c8cc
Status: Downloaded newer image for continuumio/anaconda3:latest
docker: Error response from daemon: driver failed programming external connectivity on endpoint pensive_wozniak (7543297db09bf9c4e48f56661a421b586fb04c3f53c2279b96448ed58c392cc1): Bind for 0.0.0.0:8888 failed: port is already allocated.
ERRO[0227] error waiting for container:
```
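The bind failed because an earlier container (presumably the anaconda-keras one started above) is still publishing port 8888. One option is to find and stop it; the retry below instead simply picks different host ports (8880/6066). A sketch of the first option:

```bash
# Find whichever container currently publishes port 8888, then stop it.
docker ps --filter "publish=8888"
docker stop "$(docker ps -q --filter 'publish=8888')"
```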

```bash
$ docker run -it -p 8880:8880 -p 6066:6066  continuumio/anaconda3 /bin/bash
(base) root@4842d75f325d:/# apt update; apt -y upgrade
Get:1 http://deb.debian.org/debian bullseye InRelease [116 kB]
Get:2 http://deb.debian.org/debian-security bullseye-security InRelease [48.4 kB]
Get:3 http://deb.debian.org/debian bullseye-updates InRelease [44.1 kB]
Get:4 http://deb.debian.org/debian bullseye/main arm64 Packages [8071 kB]
Get:5 http://deb.debian.org/debian-security bullseye-security/main arm64 Packages [240 kB]
Get:6 http://deb.debian.org/debian bullseye-updates/main arm64 Packages [14.9 kB]
Fetched 8533 kB in 2s (3553 kB/s)                       
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
All packages are up to date.
Reading package lists... Done
Building dependency tree... Done
Reading state information... Done
Calculating upgrade... Done
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
(base) root@4842d75f325d:/# conda install tensorflow
Collecting package metadata (current_repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.
Solving environment: unsuccessful attempt using repodata from current_repodata.json, retrying with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: done

## Package Plan ##

  environment location: /opt/conda

  added / updated specs:
    - tensorflow


The following packages will be downloaded:

    package                    |            build
    ---------------------------|-----------------
    _tflow_select-2.3.0        |              mkl           5 KB
    absl-py-1.4.0              |  py311hd43f75c_0         238 KB
    arrow-cpp-11.0.0           |       h5df12be_2         9.8 MB
    astunparse-1.6.3           |             py_0          17 KB
    aws-c-common-0.5.11        |       h2f4d8fa_0         165 KB
    aws-c-event-stream-0.1.6   |       h22f4aa5_5          25 KB
    aws-checksums-0.1.11       |       h2f4d8fa_1          47 KB
    aws-sdk-cpp-1.8.185        |       h4129968_0         1.9 MB
    blinker-1.4                |  py311hd43f75c_0          28 KB
    cachetools-4.2.2           |     pyhd3eb1b0_0          13 KB
    cryptography-41.0.3        |  py311hdbb800f_0         2.1 MB
    curl-8.1.1                 |       h998d150_0          83 KB
    flatbuffers-2.0.0          |       h419075a_0         1.0 MB
    gast-0.4.0                 |     pyhd3eb1b0_0          13 KB
    google-auth-2.22.0         |  py311hd43f75c_0         231 KB
    google-auth-oauthlib-0.5.2 |  py311hd43f75c_0          29 KB
    google-pasta-0.2.0         |     pyhd3eb1b0_0          46 KB
    grpc-cpp-1.48.2            |       h7de0283_0         4.7 MB
    grpcio-1.48.2              |  py311h7de0283_0         802 KB
    hdf5-1.12.1                |       h6633847_2         4.8 MB
    keras-2.12.0               |  py311hd43f75c_0         2.1 MB
    keras-preprocessing-1.1.2  |     pyhd3eb1b0_0          35 KB
    krb5-1.19.4                |       ha2725d6_0         1.5 MB
    libarchive-3.6.2           |       hf5861eb_1         1.0 MB
    libcurl-8.1.1              |       h79326fa_0         414 KB
    libevent-2.1.12            |       ha9ffb65_0         427 KB
    libmamba-1.4.1             |       hb8fdbf2_0         1.7 MB
    libmambapy-1.4.1           |  py311hb8fdbf2_0         270 KB
    libnghttp2-1.52.0          |       h5192db0_1         730 KB
    libpq-12.9                 |       h140f9b7_3         2.1 MB
    libssh2-1.10.0             |       h581cc77_2         315 KB
    libthrift-0.15.0           |       he7919a5_2         3.9 MB
    numpy-1.23.5               |  py311h1ee0e17_0          10 KB
    numpy-base-1.23.5          |  py311h0572591_0         7.3 MB
    oauthlib-3.2.2             |  py311hd43f75c_0         240 KB
    openssl-1.1.1w             |       h2f4d8fa_0         3.7 MB
    opt_einsum-3.3.0           |     pyhd3eb1b0_1          57 KB
    protobuf-3.20.3            |  py311h419075a_0         387 KB
    pycurl-7.45.2              |  py311h581cc77_0         135 KB
    python-3.11.5              |       h89984f6_0        15.4 MB
    python-flatbuffers-2.0     |     pyhd3eb1b0_0          34 KB
    qt-main-5.15.2             |       h22a4792_8        53.2 MB
    requests-oauthlib-1.3.0    |             py_0          23 KB
    rsa-4.7.2                  |     pyhd3eb1b0_1          28 KB
    tensorboard-2.12.1         |  py311hd43f75c_0         5.6 MB
    tensorboard-data-server-0.7.0|  py311h622d8a8_0          16 KB
    tensorboard-plugin-wit-1.8.1|  py311hd43f75c_0         694 KB
    tensorflow-2.12.0          |mkl_py311h68d252a_0           5 KB
    tensorflow-base-2.12.0     |mkl_py311h1f3075e_0       127.3 MB
    tensorflow-estimator-2.12.0|  py311hd43f75c_0         650 KB
    termcolor-2.1.0            |  py311hd43f75c_0          13 KB
    tokenizers-0.13.2          |  py311h16a14c6_1         4.2 MB
    ------------------------------------------------------------
                                           Total:       259.5 MB

The following NEW packages will be INSTALLED:

  _tflow_select      pkgs/main/linux-aarch64::_tflow_select-2.3.0-mkl 
  absl-py            pkgs/main/linux-aarch64::absl-py-1.4.0-py311hd43f75c_0 
  astunparse         pkgs/main/noarch::astunparse-1.6.3-py_0 
  blinker            pkgs/main/linux-aarch64::blinker-1.4-py311hd43f75c_0 
  cachetools         pkgs/main/noarch::cachetools-4.2.2-pyhd3eb1b0_0 
  flatbuffers        pkgs/main/linux-aarch64::flatbuffers-2.0.0-h419075a_0 
  gast               pkgs/main/noarch::gast-0.4.0-pyhd3eb1b0_0 
  google-auth        pkgs/main/linux-aarch64::google-auth-2.22.0-py311hd43f75c_0 
  google-auth-oauth~ pkgs/main/linux-aarch64::google-auth-oauthlib-0.5.2-py311hd43f75c_0 
  google-pasta       pkgs/main/noarch::google-pasta-0.2.0-pyhd3eb1b0_0 
  grpcio             pkgs/main/linux-aarch64::grpcio-1.48.2-py311h7de0283_0 
  keras              pkgs/main/linux-aarch64::keras-2.12.0-py311hd43f75c_0 
  keras-preprocessi~ pkgs/main/noarch::keras-preprocessing-1.1.2-pyhd3eb1b0_0 
  oauthlib           pkgs/main/linux-aarch64::oauthlib-3.2.2-py311hd43f75c_0 
  opt_einsum         pkgs/main/noarch::opt_einsum-3.3.0-pyhd3eb1b0_1 
  protobuf           pkgs/main/linux-aarch64::protobuf-3.20.3-py311h419075a_0 
  python-flatbuffers pkgs/main/noarch::python-flatbuffers-2.0-pyhd3eb1b0_0 
  requests-oauthlib  pkgs/main/noarch::requests-oauthlib-1.3.0-py_0 
  rsa                pkgs/main/noarch::rsa-4.7.2-pyhd3eb1b0_1 
  tensorboard        pkgs/main/linux-aarch64::tensorboard-2.12.1-py311hd43f75c_0 
  tensorboard-data-~ pkgs/main/linux-aarch64::tensorboard-data-server-0.7.0-py311h622d8a8_0 
  tensorboard-plugi~ pkgs/main/linux-aarch64::tensorboard-plugin-wit-1.8.1-py311hd43f75c_0 
  tensorflow         pkgs/main/linux-aarch64::tensorflow-2.12.0-mkl_py311h68d252a_0 
  tensorflow-base    pkgs/main/linux-aarch64::tensorflow-base-2.12.0-mkl_py311h1f3075e_0 
  tensorflow-estima~ pkgs/main/linux-aarch64::tensorflow-estimator-2.12.0-py311hd43f75c_0 
  termcolor          pkgs/main/linux-aarch64::termcolor-2.1.0-py311hd43f75c_0 

The following packages will be REMOVED:

  cyrus-sasl-2.1.28-h647bc0d_1
  libcups-2.4.2-hb788212_1
  mysql-5.7.24-h3140d82_2

The following packages will be DOWNGRADED:

  arrow-cpp                               11.0.0-h001d45f_2 --> 11.0.0-h5df12be_2 
  aws-c-common                             0.6.8-h998d150_1 --> 0.5.11-h2f4d8fa_0 
  aws-c-event-stream                       0.1.6-h419075a_6 --> 0.1.6-h22f4aa5_5 
  aws-checksums                           0.1.11-h998d150_2 --> 0.1.11-h2f4d8fa_1 
  aws-sdk-cpp                            1.8.185-h3140d82_1 --> 1.8.185-h4129968_0 
  cryptography                       41.0.3-py311h5077475_0 --> 41.0.3-py311hdbb800f_0 
  curl                                     8.2.1-h6ac735f_0 --> 8.1.1-h998d150_0 
  grpc-cpp                                1.48.2-hdefc9b7_1 --> 1.48.2-h7de0283_0 
  hdf5                                    1.12.1-h2117f30_3 --> 1.12.1-h6633847_2 
  krb5                                    1.20.1-h2e2fba8_1 --> 1.19.4-ha2725d6_0 
  libarchive                               3.6.2-h654c02d_2 --> 3.6.2-hf5861eb_1 
  libcurl                                  8.2.1-hfa2bbb0_0 --> 8.1.1-h79326fa_0 
  libevent                                2.1.12-h6ac735f_1 --> 2.1.12-ha9ffb65_0 
  libmamba                                 1.5.1-h78dbd8a_0 --> 1.4.1-hb8fdbf2_0 
  libmambapy                          1.5.1-py311hd82f176_0 --> 1.4.1-py311hb8fdbf2_0 
  libnghttp2                              1.52.0-hb788212_1 --> 1.52.0-h5192db0_1 
  libpq                                    12.15-h6ac735f_1 --> 12.9-h140f9b7_3 
  libssh2                                 1.10.0-h6ac735f_2 --> 1.10.0-h581cc77_2 
  libthrift                               0.15.0-hb2e9abc_2 --> 0.15.0-he7919a5_2 
  numpy                              1.24.3-py311hb7dbe3b_0 --> 1.23.5-py311h1ee0e17_0 
  numpy-base                         1.24.3-py311hb6890e9_0 --> 1.23.5-py311h0572591_0 
  openssl                                 3.0.10-h2f4d8fa_2 --> 1.1.1w-h2f4d8fa_0 
  pycurl                             7.45.2-py311h6ac735f_1 --> 7.45.2-py311h581cc77_0 
  python                                  3.11.5-h4bb2201_0 --> 3.11.5-h89984f6_0 
  qt-main                                 5.15.2-hf18d10e_9 --> 5.15.2-h22a4792_8 
  tokenizers                         0.13.2-py311hb4c1b22_1 --> 0.13.2-py311h16a14c6_1 


Proceed ([y]/n)? y


Downloading and Extracting Packages
                                                                                
Preparing transaction: done                                                     
Verifying transaction: done                                                     
Executing transaction: done                                                     
(base) root@4842d75f325d:/# conda install -y keras
Collecting package metadata (current_repodata.json): done                       
Solving environment: done                                                       
                                                                                
# All requested packages already installed.                                     
                                                                                
(base) root@4842d75f325d:/# git clone https://github.com/oreilly-japan/deep-learning-with-keras-ja.git                                                          
Cloning into 'deep-learning-with-keras-ja'...                                   
remote: Enumerating objects: 244, done.
remote: Counting objects: 100% (123/123), done.
remote: Compressing objects: 100% (101/101), done.
remote: Total 244 (delta 86), reused 20 (delta 20), pack-reused 121
Receiving objects: 100% (244/244), 35.95 MiB | 5.52 MiB/s, done.                
Resolving deltas: 100% (117/117), done.                                         
(base) root@4842d75f325d:/# conda install quiver_engine                         
Collecting package metadata (current_repodata.json): done                       
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.                                                             
Collecting package metadata (repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - quiver_engine

Current channels:

  - https://repo.anaconda.com/pkgs/main/linux-aarch64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-aarch64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.


(base) root@4842d75f325d:/#
```

Huh? Is Keras already included with TensorFlow?
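It seems so: the conda tensorflow package already pulled in keras-2.12.0 as a dependency (visible in the download list above), and since TensorFlow 2.x Keras is also exposed as tf.keras. A quick check:

```bash
python - <<'EOF'
import tensorflow as tf
print(tf.__version__)        # 2.12.0 in this environment
print(tf.keras.__version__)  # Keras bundled with TensorFlow
EOF
```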

By the way, what should I do about the `PackagesNotFoundError: The following packages are not available from current channels`? One possible workaround is sketched below.
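When a package is not available from the configured conda channels, it can sometimes be installed from PyPI into the same environment instead. A sketch, assuming quiver_engine is published on PyPI:

```bash
# Fall back to pip inside the active conda (base) environment.
pip install quiver_engine
```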

## Instead of a summary

The Qiita views and likes are not proportional to the Docker Hub pulls and stars. With only three articles so far that is only natural, but it still stings a little.

## Related materials

@kazuo_reve 私が効果を確認した「小川メソッド」
https://qiita.com/kazuo_reve/items/a3ea1d9171deeccc04da

@kazuo_reve 新人の方によく展開している有益な情報
https://qiita.com/kazuo_reve/items/d1a3f0ee48e24bba38f1

@kazuo_reve Vモデルについて勘違いしていたと思ったこと
https://qiita.com/kazuo_reve/items/46fddb094563bd9b2e1e

## My article lists

プログラマが知っていると良い「公序良俗」
https://qiita.com/kaizen_nagoya/items/9fe7c0dfac2fbd77a945

逆も真:社会人が最初に確かめるとよいこと。OSEK(69)、Ethernet(59)
https://qiita.com/kaizen_nagoya/items/39afe4a728a31b903ddc

「何を」よりも「誰を」。10年後のために今見習いたい人たち
https://qiita.com/kaizen_nagoya/items/8045978b16eb49d572b2

Qiitaの記事に3段階または5段階で到達するための方法
https://qiita.com/kaizen_nagoya/items/6e9298296852325adc5e

物理記事 上位100
https://qiita.com/kaizen_nagoya/items/66e90fe31fbe3facc6ff

量子(0) 計算機, 量子力学
https://qiita.com/kaizen_nagoya/items/1cd954cb0eed92879fd4

数学関連記事100
https://qiita.com/kaizen_nagoya/items/d8dadb49a6397e854c6d

統計(0)一覧
https://qiita.com/kaizen_nagoya/items/80d3b221807e53e88aba

図(0) state, sequence and timing. UML and お絵描き
https://qiita.com/kaizen_nagoya/items/60440a882146aeee9e8f

品質一覧
https://qiita.com/kaizen_nagoya/items/2b99b8e9db6d94b2e971

言語・文学記事 100
https://qiita.com/kaizen_nagoya/items/42d58d5ef7fb53c407d6

医工連携関連記事一覧
https://qiita.com/kaizen_nagoya/items/6ab51c12ba51bc260a82

自動車 記事 100
https://qiita.com/kaizen_nagoya/items/f7f0b9ab36569ad409c5

通信記事100
https://qiita.com/kaizen_nagoya/items/1d67de5e1cd207b05ef7

日本語(0)一欄
https://qiita.com/kaizen_nagoya/items/7498dcfa3a9ba7fd1e68

英語(0) 一覧
https://qiita.com/kaizen_nagoya/items/680e3f5cbf9430486c7d

転職(0)一覧
https://qiita.com/kaizen_nagoya/items/f77520d378d33451d6fe

仮説(0)一覧(目標100現在40)
https://qiita.com/kaizen_nagoya/items/f000506fe1837b3590df

音楽 一覧(0)
https://qiita.com/kaizen_nagoya/items/b6e5f42bbfe3bbe40f5d

「@kazuo_reve 新人の方によく展開している有益な情報」確認一覧
https://qiita.com/kaizen_nagoya/items/b9380888d1e5a042646b

Qiita(0)Qiita関連記事一覧(自分)
https://qiita.com/kaizen_nagoya/items/58db5fbf036b28e9dfa6

鉄道(0)鉄道のシステム考察はてっちゃんがてつだってくれる
https://qiita.com/kaizen_nagoya/items/26bda595f341a27901a0

安全(0)安全工学シンポジウムに向けて: 21
https://qiita.com/kaizen_nagoya/items/c5d78f3def8195cb2409

一覧の一覧( The directory of directories of mine.) Qiita(100)
https://qiita.com/kaizen_nagoya/items/7eb0e006543886138f39

Ethernet 記事一覧 Ethernet(0)
https://qiita.com/kaizen_nagoya/items/88d35e99f74aefc98794

Wireshark 一覧 wireshark(0)、Ethernet(48)
https://qiita.com/kaizen_nagoya/items/fbed841f61875c4731d0

線網(Wi-Fi)空中線(antenna)(0) 記事一覧(118/300目標)
https://qiita.com/kaizen_nagoya/items/5e5464ac2b24bd4cd001

OSEK OS設計の基礎 OSEK(100)
https://qiita.com/kaizen_nagoya/items/7528a22a14242d2d58a3

Error一覧 error(0)
https://qiita.com/kaizen_nagoya/items/48b6cbc8d68eae2c42b8

C++ Support(0)
https://qiita.com/kaizen_nagoya/items/8720d26f762369a80514

Coding(0) Rules, C, Secure, MISRA and so on
https://qiita.com/kaizen_nagoya/items/400725644a8a0e90fbb0

coding (101) 一覧を作成し始めた。omake:最近のQiitaで表示しない5つの事象
https://qiita.com/kaizen_nagoya/items/20667f09f19598aedb68

プログラマによる、プログラマのための、統計(0)と確率のプログラミングとその後
https://qiita.com/kaizen_nagoya/items/6e9897eb641268766909

なぜdockerで機械学習するか 書籍・ソース一覧作成中 (目標100)
https://qiita.com/kaizen_nagoya/items/ddd12477544bf5ba85e2

言語処理100本ノックをdockerで。python覚えるのに最適。:10+12
https://qiita.com/kaizen_nagoya/items/7e7eb7c543e0c18438c4

プログラムちょい替え(0)一覧:4件
https://qiita.com/kaizen_nagoya/items/296d87ef4bfd516bc394

Python(0)記事をまとめたい。
https://qiita.com/kaizen_nagoya/items/088c57d70ab6904ebb53

官公庁・学校・公的団体(NPOを含む)システムの課題、官(0)
https://qiita.com/kaizen_nagoya/items/04ee6eaf7ec13d3af4c3

「はじめての」シリーズ  ベクタージャパン 
https://qiita.com/kaizen_nagoya/items/2e41634f6e21a3cf74eb

AUTOSAR(0)Qiita記事一覧, OSEK(75)
https://qiita.com/kaizen_nagoya/items/89c07961b59a8754c869

プログラマが知っていると良い「公序良俗」
https://qiita.com/kaizen_nagoya/items/9fe7c0dfac2fbd77a945

LaTeX(0) 一覧 
https://qiita.com/kaizen_nagoya/items/e3f7dafacab58c499792

自動制御、制御工学一覧(0)
https://qiita.com/kaizen_nagoya/items/7767a4e19a6ae1479e6b

Rust(0) 一覧 
https://qiita.com/kaizen_nagoya/items/5e8bb080ba6ca0281927

100以上いいねをいただいた記事16選
https://qiita.com/kaizen_nagoya/items/f8d958d9084ffbd15d2a

小川清最終講義、最終講義(再)計画, Ethernet(100) 英語(100) 安全(100)
https://qiita.com/kaizen_nagoya/items/e2df642e3951e35e6a53

This article is a personal impression based on my own past experience. It has nothing to do with the organization or business to which I currently belong.

## Document history

ver. 0.10 first draft, 2023-10-01

Thank you very much for reading to the last sentence.

Please press the like icon 💚 and follow me.
