
PackagesNotFoundError: conda error of the day (8)

Posted at 2023-10-01

Currently verifying machine-learning books and their source code with Docker on an M2 Mac (macOS):
https://qiita.com/kaizen_nagoya/items/887fa4a2ce9a7f90ca0f

「直感Deep Learning」 by Antonio Gulli and Sujit Pal, Machine learning with Docker (3) with anaconda (3) https://qiita.com/kaizen_nagoya/items/483ae708c71c88419c32

The error below appeared while reproducing the examples from that article.

bash
# conda install quiver_engine                         
Collecting package metadata (current_repodata.json): done                       
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.                         
Collecting package metadata (repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - quiver_engine

Current channels:

  - https://repo.anaconda.com/pkgs/main/linux-aarch64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-aarch64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.

conda

Searching requires logging in, so I registered an Anaconda account. Anaconda has served me well for a long time; I have nothing but gratitude.

The official Anaconda Docker image is impressive: `apt update; apt upgrade` finds nothing to upgrade. Do they rebuild it every day?

quiver_engine

| Favorites | Downloads | Artifact (owner / artifact) | Description | Platforms |
| --- | --- | --- | --- | --- |
| 0 | 1504 | anaconda / quiver_engine 0.1.4.1.4 | Interactive per-layer visualization for convents in keras | conda linux-64, osx-64 |
| 0 | 146 | main / quiver_engine 0.1.4.1.4 | Interactive per-layer visualization for convents in keras | conda linux-64, osx-64 |
| 0 | 46 | jjh_cio_testing / quiver_engine | Interactive per-layer visualization for convents in keras | conda linux-64 |
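Note the Platforms column: every quiver_engine build on anaconda.org targets linux-64 or osx-64. Inside a Docker container on an M2 Mac, conda resolves against linux-aarch64 (and noarch) subdirectories, as the error message's channel URLs show, so there is no matching build no matter which channel is added. A minimal sketch of the mismatch:

```python
# Platforms offered by the quiver_engine builds listed on anaconda.org
available_subdirs = {"linux-64", "osx-64"}

# Subdirs conda searches in an arm64 (M2 Mac) container,
# taken from the channel URLs in the error message
requested_subdirs = {"linux-aarch64", "noarch"}

# Empty intersection -> PackagesNotFoundError regardless of channel
matching = available_subdirs & requested_subdirs
print(matching)  # set()
```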

conda install -c anaconda quiver_engine

bash
# conda install -c anaconda quiver_engine
Collecting package metadata (current_repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: unsuccessful initial attempt using frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - quiver_engine

Current channels:

  - https://conda.anaconda.org/anaconda/linux-aarch64
  - https://conda.anaconda.org/anaconda/noarch
  - https://repo.anaconda.com/pkgs/main/linux-aarch64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/linux-aarch64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.

bash
#  pip install quiver_engine
Collecting quiver_engine
  Downloading quiver_engine-0.1.4.1.4.tar.gz (398 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 398.7/398.7 kB 5.1 MB/s eta 0:00:00
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [1 lines of output]
      error in quiver_engine setup command: "values of 'package_data' dict" must be a list of strings (got 'quiverboard/dist/*')
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.
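As pip's note says, this failure is a bug in quiver_engine's own setup.py, not in pip: newer setuptools requires each value in the `package_data` dict to be a list of strings, while quiver_engine passes a bare glob string. A sketch of the broken versus fixed declaration (only the `package_data` argument is shown; the rest of the setup call is elided, and the validation function below only approximates the check setuptools performs):

```python
# Broken: a bare string as the dict value, which newer setuptools rejects
broken = {"quiver_engine": "quiverboard/dist/*"}

# Fixed: wrap the glob pattern in a list of strings
fixed = {"quiver_engine": ["quiverboard/dist/*"]}

# Roughly the check behind the error message:
# every value must be a list of strings
def valid_package_data(pd):
    return all(
        isinstance(v, list) and all(isinstance(s, str) for s in v)
        for v in pd.values()
    )

print(valid_package_data(broken))  # False
print(valid_package_data(fixed))   # True
```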
bash
(base) root@4842d75f325d:/# ls
bin   deep-learning-with-keras-ja  etc	 lib	mnt  proc  run	 srv  tmp  var
boot  dev			   home  media	opt  root  sbin  sys  usr
(base) root@4842d75f325d:/# cd deep-learning-with-keras-ja/
(base) root@4842d75f325d:/deep-learning-with-keras-ja# ls
README.md  ch02  ch04  ch06  ch08
ch01	   ch03  ch05  ch07  deep-learning-with-keras-ja.png
(base) root@4842d75f325d:/deep-learning-with-keras-ja# cd ch01
(base) root@4842d75f325d:/deep-learning-with-keras-ja/ch01# ls
keras_MINST_V1.py  keras_MINST_V3.py  make_tensorboard.py  requirements_gpu.txt
keras_MINST_V2.py  keras_MINST_V4.py  requirements.txt
(base) root@4842d75f325d:/deep-learning-with-keras-ja/ch01# python keras_MINST_V1.py 
Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz
11490434/11490434 [==============================] - 1s 0us/step
60000 train samples
10000 test samples
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 10)                7850      
                                                                 
 activation (Activation)     (None, 10)                0         
                                                                 
=================================================================
Total params: 7,850
Trainable params: 7,850
Non-trainable params: 0
_________________________________________________________________
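The summary's 7,850 parameters check out: the single Dense layer maps the flattened 28×28 input to 10 classes, giving 784×10 weights plus 10 biases, and the Activation layer adds none.

```python
inputs = 28 * 28   # flattened MNIST image
classes = 10
# Dense layer: one weight per (input, class) pair, plus one bias per class
params = inputs * classes + classes
print(params)  # 7850
```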
Epoch 1/200
375/375 [==============================] - 1s 1ms/step - loss: 1.3734 - accuracy: 0.6789 - val_loss: 0.8889 - val_accuracy: 0.8277
Epoch 2/200
375/375 [==============================] - 0s 699us/step - loss: 0.7901 - accuracy: 0.8283 - val_loss: 0.6550 - val_accuracy: 0.8580
Epoch 3/200
375/375 [==============================] - 0s 696us/step - loss: 0.6422 - accuracy: 0.8502 - val_loss: 0.5608 - val_accuracy: 0.8678
Epoch 4/200
375/375 [==============================] - 0s 685us/step - loss: 0.5706 - accuracy: 0.8608 - val_loss: 0.5085 - val_accuracy: 0.8749
Epoch 5/200
375/375 [==============================] - 0s 679us/step - loss: 0.5269 - accuracy: 0.8677 - val_loss: 0.4747 - val_accuracy: 0.8804
Epoch 6/200
375/375 [==============================] - 0s 668us/step - loss: 0.4969 - accuracy: 0.8726 - val_loss: 0.4507 - val_accuracy: 0.8862
Epoch 7/200
375/375 [==============================] - 0s 669us/step - loss: 0.4745 - accuracy: 0.8770 - val_loss: 0.4326 - val_accuracy: 0.8888
Epoch 8/200
375/375 [==============================] - 0s 677us/step - loss: 0.4573 - accuracy: 0.8804 - val_loss: 0.4187 - val_accuracy: 0.8913
Epoch 9/200
375/375 [==============================] - 0s 678us/step - loss: 0.4433 - accuracy: 0.8825 - val_loss: 0.4069 - val_accuracy: 0.8940
Epoch 10/200
375/375 [==============================] - 0s 689us/step - loss: 0.4318 - accuracy: 0.8851 - val_loss: 0.3973 - val_accuracy: 0.8964
Epoch 11/200
375/375 [==============================] - 0s 693us/step - loss: 0.4220 - accuracy: 0.8869 - val_loss: 0.3892 - val_accuracy: 0.8977
Epoch 12/200
375/375 [==============================] - 0s 690us/step - loss: 0.4136 - accuracy: 0.8884 - val_loss: 0.3822 - val_accuracy: 0.8992
Epoch 13/200
375/375 [==============================] - 0s 689us/step - loss: 0.4062 - accuracy: 0.8901 - val_loss: 0.3763 - val_accuracy: 0.9003
Epoch 14/200
375/375 [==============================] - 0s 691us/step - loss: 0.3998 - accuracy: 0.8915 - val_loss: 0.3709 - val_accuracy: 0.9012
Epoch 15/200
375/375 [==============================] - 0s 693us/step - loss: 0.3940 - accuracy: 0.8928 - val_loss: 0.3661 - val_accuracy: 0.9023
Epoch 16/200
375/375 [==============================] - 0s 692us/step - loss: 0.3888 - accuracy: 0.8934 - val_loss: 0.3619 - val_accuracy: 0.9032
Epoch 17/200
375/375 [==============================] - 0s 691us/step - loss: 0.3841 - accuracy: 0.8949 - val_loss: 0.3578 - val_accuracy: 0.9038
Epoch 18/200
375/375 [==============================] - 0s 690us/step - loss: 0.3798 - accuracy: 0.8957 - val_loss: 0.3543 - val_accuracy: 0.9045
Epoch 19/200
375/375 [==============================] - 0s 693us/step - loss: 0.3760 - accuracy: 0.8965 - val_loss: 0.3510 - val_accuracy: 0.9055
Epoch 20/200
375/375 [==============================] - 0s 696us/step - loss: 0.3723 - accuracy: 0.8973 - val_loss: 0.3481 - val_accuracy: 0.9059
Epoch 21/200
375/375 [==============================] - 0s 697us/step - loss: 0.3690 - accuracy: 0.8981 - val_loss: 0.3454 - val_accuracy: 0.9057
Epoch 22/200
375/375 [==============================] - 0s 733us/step - loss: 0.3659 - accuracy: 0.8988 - val_loss: 0.3428 - val_accuracy: 0.9062
Epoch 23/200
375/375 [==============================] - 0s 695us/step - loss: 0.3630 - accuracy: 0.8994 - val_loss: 0.3405 - val_accuracy: 0.9069
Epoch 24/200
375/375 [==============================] - 0s 694us/step - loss: 0.3603 - accuracy: 0.8998 - val_loss: 0.3382 - val_accuracy: 0.9072
Epoch 25/200
375/375 [==============================] - 0s 729us/step - loss: 0.3577 - accuracy: 0.9004 - val_loss: 0.3363 - val_accuracy: 0.9076
Epoch 26/200
375/375 [==============================] - 0s 698us/step - loss: 0.3553 - accuracy: 0.9011 - val_loss: 0.3343 - val_accuracy: 0.9081
Epoch 27/200
375/375 [==============================] - 0s 722us/step - loss: 0.3531 - accuracy: 0.9017 - val_loss: 0.3325 - val_accuracy: 0.9085
Epoch 28/200
375/375 [==============================] - 0s 691us/step - loss: 0.3510 - accuracy: 0.9021 - val_loss: 0.3307 - val_accuracy: 0.9095
Epoch 29/200
375/375 [==============================] - 0s 688us/step - loss: 0.3489 - accuracy: 0.9024 - val_loss: 0.3289 - val_accuracy: 0.9099
Epoch 30/200
375/375 [==============================] - 0s 695us/step - loss: 0.3470 - accuracy: 0.9034 - val_loss: 0.3274 - val_accuracy: 0.9099
Epoch 31/200
375/375 [==============================] - 0s 703us/step - loss: 0.3452 - accuracy: 0.9034 - val_loss: 0.3260 - val_accuracy: 0.9112
Epoch 32/200
375/375 [==============================] - 0s 711us/step - loss: 0.3434 - accuracy: 0.9040 - val_loss: 0.3247 - val_accuracy: 0.9107
Epoch 33/200
375/375 [==============================] - 0s 682us/step - loss: 0.3418 - accuracy: 0.9043 - val_loss: 0.3231 - val_accuracy: 0.9111
Epoch 34/200
375/375 [==============================] - 0s 689us/step - loss: 0.3402 - accuracy: 0.9044 - val_loss: 0.3219 - val_accuracy: 0.9115
Epoch 35/200
375/375 [==============================] - 0s 693us/step - loss: 0.3387 - accuracy: 0.9051 - val_loss: 0.3206 - val_accuracy: 0.9120
Epoch 36/200
375/375 [==============================] - 0s 692us/step - loss: 0.3372 - accuracy: 0.9058 - val_loss: 0.3196 - val_accuracy: 0.9120
Epoch 37/200
375/375 [==============================] - 0s 686us/step - loss: 0.3358 - accuracy: 0.9058 - val_loss: 0.3184 - val_accuracy: 0.9124
Epoch 38/200
375/375 [==============================] - 0s 694us/step - loss: 0.3345 - accuracy: 0.9062 - val_loss: 0.3173 - val_accuracy: 0.9122
Epoch 39/200
375/375 [==============================] - 0s 692us/step - loss: 0.3332 - accuracy: 0.9064 - val_loss: 0.3163 - val_accuracy: 0.9123
Epoch 40/200
375/375 [==============================] - 0s 701us/step - loss: 0.3320 - accuracy: 0.9068 - val_loss: 0.3154 - val_accuracy: 0.9125
Epoch 41/200
375/375 [==============================] - 0s 685us/step - loss: 0.3308 - accuracy: 0.9072 - val_loss: 0.3144 - val_accuracy: 0.9127
Epoch 42/200
375/375 [==============================] - 0s 689us/step - loss: 0.3296 - accuracy: 0.9073 - val_loss: 0.3135 - val_accuracy: 0.9133
Epoch 43/200
375/375 [==============================] - 0s 697us/step - loss: 0.3285 - accuracy: 0.9081 - val_loss: 0.3126 - val_accuracy: 0.9127
Epoch 44/200
375/375 [==============================] - 0s 693us/step - loss: 0.3274 - accuracy: 0.9082 - val_loss: 0.3118 - val_accuracy: 0.9128
Epoch 45/200
375/375 [==============================] - 0s 688us/step - loss: 0.3264 - accuracy: 0.9087 - val_loss: 0.3109 - val_accuracy: 0.9132
Epoch 46/200
375/375 [==============================] - 0s 690us/step - loss: 0.3254 - accuracy: 0.9089 - val_loss: 0.3101 - val_accuracy: 0.9132
Epoch 47/200
375/375 [==============================] - 0s 693us/step - loss: 0.3244 - accuracy: 0.9091 - val_loss: 0.3094 - val_accuracy: 0.9137
Epoch 48/200
375/375 [==============================] - 0s 692us/step - loss: 0.3235 - accuracy: 0.9094 - val_loss: 0.3087 - val_accuracy: 0.9136
Epoch 49/200
375/375 [==============================] - 0s 689us/step - loss: 0.3225 - accuracy: 0.9098 - val_loss: 0.3081 - val_accuracy: 0.9135
Epoch 50/200
375/375 [==============================] - 0s 687us/step - loss: 0.3217 - accuracy: 0.9098 - val_loss: 0.3072 - val_accuracy: 0.9142
Epoch 51/200
375/375 [==============================] - 0s 699us/step - loss: 0.3207 - accuracy: 0.9099 - val_loss: 0.3067 - val_accuracy: 0.9143
Epoch 52/200
375/375 [==============================] - 0s 696us/step - loss: 0.3200 - accuracy: 0.9108 - val_loss: 0.3059 - val_accuracy: 0.9142
Epoch 53/200
375/375 [==============================] - 0s 693us/step - loss: 0.3191 - accuracy: 0.9110 - val_loss: 0.3053 - val_accuracy: 0.9143
Epoch 54/200
375/375 [==============================] - 0s 737us/step - loss: 0.3184 - accuracy: 0.9108 - val_loss: 0.3047 - val_accuracy: 0.9143
Epoch 55/200
375/375 [==============================] - 0s 697us/step - loss: 0.3176 - accuracy: 0.9109 - val_loss: 0.3040 - val_accuracy: 0.9147
Epoch 56/200
375/375 [==============================] - 0s 692us/step - loss: 0.3168 - accuracy: 0.9115 - val_loss: 0.3035 - val_accuracy: 0.9143
Epoch 57/200
375/375 [==============================] - 0s 695us/step - loss: 0.3161 - accuracy: 0.9115 - val_loss: 0.3029 - val_accuracy: 0.9152
Epoch 58/200
375/375 [==============================] - 0s 691us/step - loss: 0.3154 - accuracy: 0.9121 - val_loss: 0.3024 - val_accuracy: 0.9153
Epoch 59/200
375/375 [==============================] - 0s 706us/step - loss: 0.3147 - accuracy: 0.9121 - val_loss: 0.3018 - val_accuracy: 0.9151
Epoch 60/200
375/375 [==============================] - 0s 732us/step - loss: 0.3140 - accuracy: 0.9125 - val_loss: 0.3013 - val_accuracy: 0.9154
Epoch 61/200
375/375 [==============================] - 0s 702us/step - loss: 0.3134 - accuracy: 0.9126 - val_loss: 0.3008 - val_accuracy: 0.9151
Epoch 62/200
375/375 [==============================] - 0s 696us/step - loss: 0.3127 - accuracy: 0.9126 - val_loss: 0.3003 - val_accuracy: 0.9153
Epoch 63/200
375/375 [==============================] - 0s 707us/step - loss: 0.3121 - accuracy: 0.9132 - val_loss: 0.2998 - val_accuracy: 0.9153
Epoch 64/200
375/375 [==============================] - 0s 700us/step - loss: 0.3115 - accuracy: 0.9133 - val_loss: 0.2994 - val_accuracy: 0.9160
Epoch 65/200
375/375 [==============================] - 0s 696us/step - loss: 0.3108 - accuracy: 0.9134 - val_loss: 0.2989 - val_accuracy: 0.9157
Epoch 66/200
375/375 [==============================] - 0s 696us/step - loss: 0.3103 - accuracy: 0.9136 - val_loss: 0.2984 - val_accuracy: 0.9158
Epoch 67/200
375/375 [==============================] - 0s 695us/step - loss: 0.3097 - accuracy: 0.9137 - val_loss: 0.2980 - val_accuracy: 0.9164
Epoch 68/200
375/375 [==============================] - 0s 699us/step - loss: 0.3091 - accuracy: 0.9138 - val_loss: 0.2976 - val_accuracy: 0.9163
Epoch 69/200
375/375 [==============================] - 0s 691us/step - loss: 0.3086 - accuracy: 0.9141 - val_loss: 0.2972 - val_accuracy: 0.9160
Epoch 70/200
375/375 [==============================] - 0s 758us/step - loss: 0.3080 - accuracy: 0.9144 - val_loss: 0.2967 - val_accuracy: 0.9166
Epoch 71/200
375/375 [==============================] - 0s 698us/step - loss: 0.3075 - accuracy: 0.9145 - val_loss: 0.2964 - val_accuracy: 0.9170
Epoch 72/200
375/375 [==============================] - 0s 700us/step - loss: 0.3069 - accuracy: 0.9148 - val_loss: 0.2960 - val_accuracy: 0.9169
Epoch 73/200
375/375 [==============================] - 0s 692us/step - loss: 0.3065 - accuracy: 0.9146 - val_loss: 0.2956 - val_accuracy: 0.9167
Epoch 74/200
375/375 [==============================] - 0s 699us/step - loss: 0.3060 - accuracy: 0.9147 - val_loss: 0.2952 - val_accuracy: 0.9170
Epoch 75/200
375/375 [==============================] - 0s 697us/step - loss: 0.3055 - accuracy: 0.9151 - val_loss: 0.2949 - val_accuracy: 0.9168
Epoch 76/200
375/375 [==============================] - 0s 698us/step - loss: 0.3050 - accuracy: 0.9152 - val_loss: 0.2945 - val_accuracy: 0.9168
Epoch 77/200
375/375 [==============================] - 0s 694us/step - loss: 0.3045 - accuracy: 0.9153 - val_loss: 0.2941 - val_accuracy: 0.9171
Epoch 78/200
375/375 [==============================] - 0s 705us/step - loss: 0.3041 - accuracy: 0.9153 - val_loss: 0.2938 - val_accuracy: 0.9173
Epoch 79/200
375/375 [==============================] - 0s 701us/step - loss: 0.3036 - accuracy: 0.9155 - val_loss: 0.2934 - val_accuracy: 0.9174
Epoch 80/200
375/375 [==============================] - 0s 702us/step - loss: 0.3032 - accuracy: 0.9154 - val_loss: 0.2931 - val_accuracy: 0.9170
Epoch 81/200
375/375 [==============================] - 0s 695us/step - loss: 0.3027 - accuracy: 0.9159 - val_loss: 0.2928 - val_accuracy: 0.9174
Epoch 82/200
375/375 [==============================] - 0s 694us/step - loss: 0.3023 - accuracy: 0.9157 - val_loss: 0.2925 - val_accuracy: 0.9178
Epoch 83/200
375/375 [==============================] - 0s 699us/step - loss: 0.3018 - accuracy: 0.9159 - val_loss: 0.2922 - val_accuracy: 0.9183
Epoch 84/200
375/375 [==============================] - 0s 694us/step - loss: 0.3015 - accuracy: 0.9159 - val_loss: 0.2919 - val_accuracy: 0.9182
Epoch 85/200
375/375 [==============================] - 0s 693us/step - loss: 0.3010 - accuracy: 0.9159 - val_loss: 0.2916 - val_accuracy: 0.9176
Epoch 86/200
375/375 [==============================] - 0s 695us/step - loss: 0.3006 - accuracy: 0.9161 - val_loss: 0.2913 - val_accuracy: 0.9179
Epoch 87/200
375/375 [==============================] - 0s 700us/step - loss: 0.3002 - accuracy: 0.9163 - val_loss: 0.2911 - val_accuracy: 0.9181
Epoch 88/200
375/375 [==============================] - 0s 693us/step - loss: 0.2999 - accuracy: 0.9161 - val_loss: 0.2908 - val_accuracy: 0.9189
Epoch 89/200
375/375 [==============================] - 0s 694us/step - loss: 0.2995 - accuracy: 0.9159 - val_loss: 0.2905 - val_accuracy: 0.9188
Epoch 90/200
375/375 [==============================] - 0s 692us/step - loss: 0.2991 - accuracy: 0.9164 - val_loss: 0.2902 - val_accuracy: 0.9187
Epoch 91/200
375/375 [==============================] - 0s 701us/step - loss: 0.2987 - accuracy: 0.9162 - val_loss: 0.2900 - val_accuracy: 0.9187
Epoch 92/200
375/375 [==============================] - 0s 696us/step - loss: 0.2984 - accuracy: 0.9163 - val_loss: 0.2896 - val_accuracy: 0.9188
Epoch 93/200
375/375 [==============================] - 0s 702us/step - loss: 0.2980 - accuracy: 0.9168 - val_loss: 0.2894 - val_accuracy: 0.9189
Epoch 94/200
375/375 [==============================] - 0s 694us/step - loss: 0.2977 - accuracy: 0.9168 - val_loss: 0.2892 - val_accuracy: 0.9190
Epoch 95/200
375/375 [==============================] - 0s 691us/step - loss: 0.2973 - accuracy: 0.9168 - val_loss: 0.2890 - val_accuracy: 0.9191
Epoch 96/200
375/375 [==============================] - 0s 700us/step - loss: 0.2970 - accuracy: 0.9165 - val_loss: 0.2887 - val_accuracy: 0.9194
Epoch 97/200
375/375 [==============================] - 0s 699us/step - loss: 0.2966 - accuracy: 0.9169 - val_loss: 0.2884 - val_accuracy: 0.9187
Epoch 98/200
375/375 [==============================] - 0s 733us/step - loss: 0.2963 - accuracy: 0.9170 - val_loss: 0.2882 - val_accuracy: 0.9191
Epoch 99/200
375/375 [==============================] - 0s 701us/step - loss: 0.2960 - accuracy: 0.9171 - val_loss: 0.2879 - val_accuracy: 0.9193
Epoch 100/200
375/375 [==============================] - 0s 698us/step - loss: 0.2956 - accuracy: 0.9175 - val_loss: 0.2878 - val_accuracy: 0.9194
Epoch 101/200
375/375 [==============================] - 0s 728us/step - loss: 0.2953 - accuracy: 0.9172 - val_loss: 0.2875 - val_accuracy: 0.9194
Epoch 102/200
375/375 [==============================] - 0s 702us/step - loss: 0.2950 - accuracy: 0.9175 - val_loss: 0.2874 - val_accuracy: 0.9194
Epoch 103/200
375/375 [==============================] - 0s 724us/step - loss: 0.2947 - accuracy: 0.9175 - val_loss: 0.2870 - val_accuracy: 0.9196
Epoch 104/200
375/375 [==============================] - 0s 698us/step - loss: 0.2944 - accuracy: 0.9177 - val_loss: 0.2869 - val_accuracy: 0.9195
Epoch 105/200
375/375 [==============================] - 0s 692us/step - loss: 0.2941 - accuracy: 0.9181 - val_loss: 0.2867 - val_accuracy: 0.9201
Epoch 106/200
375/375 [==============================] - 0s 700us/step - loss: 0.2938 - accuracy: 0.9180 - val_loss: 0.2865 - val_accuracy: 0.9197
Epoch 107/200
375/375 [==============================] - 0s 693us/step - loss: 0.2935 - accuracy: 0.9180 - val_loss: 0.2862 - val_accuracy: 0.9197
Epoch 108/200
375/375 [==============================] - 0s 730us/step - loss: 0.2932 - accuracy: 0.9182 - val_loss: 0.2860 - val_accuracy: 0.9202
Epoch 109/200
375/375 [==============================] - 0s 691us/step - loss: 0.2929 - accuracy: 0.9185 - val_loss: 0.2859 - val_accuracy: 0.9198
Epoch 110/200
375/375 [==============================] - 0s 696us/step - loss: 0.2926 - accuracy: 0.9182 - val_loss: 0.2856 - val_accuracy: 0.9203
Epoch 111/200
375/375 [==============================] - 0s 697us/step - loss: 0.2924 - accuracy: 0.9185 - val_loss: 0.2854 - val_accuracy: 0.9201
Epoch 112/200
375/375 [==============================] - 0s 700us/step - loss: 0.2921 - accuracy: 0.9187 - val_loss: 0.2853 - val_accuracy: 0.9197
Epoch 113/200
375/375 [==============================] - 0s 689us/step - loss: 0.2918 - accuracy: 0.9186 - val_loss: 0.2851 - val_accuracy: 0.9201
Epoch 114/200
375/375 [==============================] - 0s 696us/step - loss: 0.2915 - accuracy: 0.9189 - val_loss: 0.2849 - val_accuracy: 0.9198
Epoch 115/200
375/375 [==============================] - 0s 696us/step - loss: 0.2913 - accuracy: 0.9187 - val_loss: 0.2848 - val_accuracy: 0.9205
Epoch 116/200
375/375 [==============================] - 0s 705us/step - loss: 0.2910 - accuracy: 0.9191 - val_loss: 0.2845 - val_accuracy: 0.9203
Epoch 117/200
375/375 [==============================] - 0s 694us/step - loss: 0.2908 - accuracy: 0.9190 - val_loss: 0.2844 - val_accuracy: 0.9206
Epoch 118/200
375/375 [==============================] - 0s 697us/step - loss: 0.2905 - accuracy: 0.9191 - val_loss: 0.2842 - val_accuracy: 0.9199
Epoch 119/200
375/375 [==============================] - 0s 695us/step - loss: 0.2902 - accuracy: 0.9191 - val_loss: 0.2841 - val_accuracy: 0.9206
Epoch 120/200
375/375 [==============================] - 0s 698us/step - loss: 0.2900 - accuracy: 0.9193 - val_loss: 0.2839 - val_accuracy: 0.9207
Epoch 121/200
375/375 [==============================] - 0s 690us/step - loss: 0.2898 - accuracy: 0.9193 - val_loss: 0.2838 - val_accuracy: 0.9207
Epoch 122/200
375/375 [==============================] - 0s 693us/step - loss: 0.2895 - accuracy: 0.9194 - val_loss: 0.2836 - val_accuracy: 0.9208
Epoch 123/200
375/375 [==============================] - 0s 694us/step - loss: 0.2893 - accuracy: 0.9193 - val_loss: 0.2834 - val_accuracy: 0.9212
Epoch 124/200
375/375 [==============================] - 0s 696us/step - loss: 0.2891 - accuracy: 0.9198 - val_loss: 0.2832 - val_accuracy: 0.9204
Epoch 125/200
375/375 [==============================] - 0s 696us/step - loss: 0.2888 - accuracy: 0.9197 - val_loss: 0.2830 - val_accuracy: 0.9211
Epoch 126/200
375/375 [==============================] - 0s 692us/step - loss: 0.2886 - accuracy: 0.9200 - val_loss: 0.2829 - val_accuracy: 0.9212
Epoch 127/200
375/375 [==============================] - 0s 694us/step - loss: 0.2884 - accuracy: 0.9199 - val_loss: 0.2827 - val_accuracy: 0.9212
Epoch 128/200
375/375 [==============================] - 0s 695us/step - loss: 0.2881 - accuracy: 0.9199 - val_loss: 0.2826 - val_accuracy: 0.9212
Epoch 129/200
375/375 [==============================] - 0s 694us/step - loss: 0.2879 - accuracy: 0.9198 - val_loss: 0.2824 - val_accuracy: 0.9212
Epoch 130/200
375/375 [==============================] - 0s 690us/step - loss: 0.2876 - accuracy: 0.9199 - val_loss: 0.2824 - val_accuracy: 0.9210
Epoch 131/200
375/375 [==============================] - 0s 698us/step - loss: 0.2875 - accuracy: 0.9200 - val_loss: 0.2821 - val_accuracy: 0.9212
Epoch 132/200
375/375 [==============================] - 0s 695us/step - loss: 0.2872 - accuracy: 0.9202 - val_loss: 0.2821 - val_accuracy: 0.9216
Epoch 133/200
375/375 [==============================] - 0s 693us/step - loss: 0.2870 - accuracy: 0.9202 - val_loss: 0.2818 - val_accuracy: 0.9212
Epoch 134/200
375/375 [==============================] - 0s 691us/step - loss: 0.2868 - accuracy: 0.9205 - val_loss: 0.2817 - val_accuracy: 0.9218
Epoch 135/200
375/375 [==============================] - 0s 704us/step - loss: 0.2866 - accuracy: 0.9201 - val_loss: 0.2815 - val_accuracy: 0.9208
Epoch 136/200
375/375 [==============================] - 0s 730us/step - loss: 0.2864 - accuracy: 0.9203 - val_loss: 0.2815 - val_accuracy: 0.9212
Epoch 137/200
375/375 [==============================] - 0s 697us/step - loss: 0.2861 - accuracy: 0.9203 - val_loss: 0.2814 - val_accuracy: 0.9206
Epoch 138/200
375/375 [==============================] - 0s 696us/step - loss: 0.2860 - accuracy: 0.9205 - val_loss: 0.2812 - val_accuracy: 0.9214
Epoch 139/200
375/375 [==============================] - 0s 704us/step - loss: 0.2858 - accuracy: 0.9206 - val_loss: 0.2810 - val_accuracy: 0.9211
Epoch 140/200
375/375 [==============================] - 0s 701us/step - loss: 0.2856 - accuracy: 0.9206 - val_loss: 0.2810 - val_accuracy: 0.9213
Epoch 141/200
375/375 [==============================] - 0s 694us/step - loss: 0.2854 - accuracy: 0.9207 - val_loss: 0.2808 - val_accuracy: 0.9219
Epoch 142/200
375/375 [==============================] - 0s 697us/step - loss: 0.2852 - accuracy: 0.9208 - val_loss: 0.2807 - val_accuracy: 0.9215
Epoch 143/200
375/375 [==============================] - 0s 696us/step - loss: 0.2850 - accuracy: 0.9209 - val_loss: 0.2805 - val_accuracy: 0.9216
Epoch 144/200
375/375 [==============================] - 0s 698us/step - loss: 0.2848 - accuracy: 0.9207 - val_loss: 0.2804 - val_accuracy: 0.9219
Epoch 145/200
375/375 [==============================] - 0s 697us/step - loss: 0.2846 - accuracy: 0.9210 - val_loss: 0.2804 - val_accuracy: 0.9221
Epoch 146/200
375/375 [==============================] - 0s 757us/step - loss: 0.2844 - accuracy: 0.9210 - val_loss: 0.2802 - val_accuracy: 0.9220
Epoch 147/200
375/375 [==============================] - 0s 687us/step - loss: 0.2842 - accuracy: 0.9211 - val_loss: 0.2801 - val_accuracy: 0.9218
Epoch 148/200
375/375 [==============================] - 0s 697us/step - loss: 0.2841 - accuracy: 0.9210 - val_loss: 0.2799 - val_accuracy: 0.9214
Epoch 149/200
375/375 [==============================] - 0s 696us/step - loss: 0.2838 - accuracy: 0.9210 - val_loss: 0.2799 - val_accuracy: 0.9220
Epoch 150/200
375/375 [==============================] - 0s 698us/step - loss: 0.2837 - accuracy: 0.9210 - val_loss: 0.2798 - val_accuracy: 0.9221
Epoch 151/200
375/375 [==============================] - 0s 694us/step - loss: 0.2834 - accuracy: 0.9209 - val_loss: 0.2797 - val_accuracy: 0.9221
Epoch 152/200
375/375 [==============================] - 0s 698us/step - loss: 0.2833 - accuracy: 0.9215 - val_loss: 0.2795 - val_accuracy: 0.9222
Epoch 153/200
375/375 [==============================] - 0s 694us/step - loss: 0.2832 - accuracy: 0.9216 - val_loss: 0.2794 - val_accuracy: 0.9222
Epoch 154/200
375/375 [==============================] - 0s 704us/step - loss: 0.2829 - accuracy: 0.9215 - val_loss: 0.2793 - val_accuracy: 0.9214
Epoch 155/200
375/375 [==============================] - 0s 698us/step - loss: 0.2828 - accuracy: 0.9216 - val_loss: 0.2792 - val_accuracy: 0.9217
Epoch 156/200
375/375 [==============================] - 0s 699us/step - loss: 0.2826 - accuracy: 0.9214 - val_loss: 0.2791 - val_accuracy: 0.9224
Epoch 157/200
375/375 [==============================] - 0s 693us/step - loss: 0.2824 - accuracy: 0.9216 - val_loss: 0.2789 - val_accuracy: 0.9220
Epoch 158/200
375/375 [==============================] - 0s 697us/step - loss: 0.2823 - accuracy: 0.9214 - val_loss: 0.2788 - val_accuracy: 0.9221
Epoch 159/200
375/375 [==============================] - 0s 697us/step - loss: 0.2821 - accuracy: 0.9220 - val_loss: 0.2787 - val_accuracy: 0.9223
Epoch 160/200
375/375 [==============================] - 0s 696us/step - loss: 0.2819 - accuracy: 0.9220 - val_loss: 0.2787 - val_accuracy: 0.9223
Epoch 161/200
375/375 [==============================] - 0s 697us/step - loss: 0.2818 - accuracy: 0.9217 - val_loss: 0.2785 - val_accuracy: 0.9224
Epoch 162/200
375/375 [==============================] - 0s 690us/step - loss: 0.2816 - accuracy: 0.9217 - val_loss: 0.2786 - val_accuracy: 0.9222
Epoch 163/200
375/375 [==============================] - 0s 696us/step - loss: 0.2814 - accuracy: 0.9219 - val_loss: 0.2783 - val_accuracy: 0.9224
Epoch 164/200
375/375 [==============================] - 0s 697us/step - loss: 0.2812 - accuracy: 0.9218 - val_loss: 0.2783 - val_accuracy: 0.9227
Epoch 165/200
375/375 [==============================] - 0s 694us/step - loss: 0.2811 - accuracy: 0.9220 - val_loss: 0.2781 - val_accuracy: 0.9222
Epoch 166/200
375/375 [==============================] - 0s 693us/step - loss: 0.2810 - accuracy: 0.9221 - val_loss: 0.2781 - val_accuracy: 0.9222
Epoch 167/200
375/375 [==============================] - 0s 694us/step - loss: 0.2808 - accuracy: 0.9221 - val_loss: 0.2780 - val_accuracy: 0.9224
Epoch 168/200
375/375 [==============================] - 0s 699us/step - loss: 0.2807 - accuracy: 0.9218 - val_loss: 0.2778 - val_accuracy: 0.9224
Epoch 169/200
375/375 [==============================] - 0s 697us/step - loss: 0.2805 - accuracy: 0.9222 - val_loss: 0.2777 - val_accuracy: 0.9221
Epoch 170/200
375/375 [==============================] - 0s 692us/step - loss: 0.2803 - accuracy: 0.9220 - val_loss: 0.2776 - val_accuracy: 0.9222
Epoch 171/200
375/375 [==============================] - 0s 697us/step - loss: 0.2802 - accuracy: 0.9225 - val_loss: 0.2776 - val_accuracy: 0.9227
Epoch 172/200
375/375 [==============================] - 0s 699us/step - loss: 0.2801 - accuracy: 0.9223 - val_loss: 0.2775 - val_accuracy: 0.9224
Epoch 173/200
375/375 [==============================] - 0s 708us/step - loss: 0.2799 - accuracy: 0.9223 - val_loss: 0.2774 - val_accuracy: 0.9222
Epoch 174/200
375/375 [==============================] - 0s 721us/step - loss: 0.2798 - accuracy: 0.9225 - val_loss: 0.2773 - val_accuracy: 0.9223
Epoch 175/200
375/375 [==============================] - 0s 702us/step - loss: 0.2796 - accuracy: 0.9222 - val_loss: 0.2772 - val_accuracy: 0.9223
Epoch 176/200
375/375 [==============================] - 0s 700us/step - loss: 0.2795 - accuracy: 0.9225 - val_loss: 0.2771 - val_accuracy: 0.9222
Epoch 177/200
375/375 [==============================] - 0s 731us/step - loss: 0.2793 - accuracy: 0.9226 - val_loss: 0.2771 - val_accuracy: 0.9227
Epoch 178/200
375/375 [==============================] - 0s 701us/step - loss: 0.2792 - accuracy: 0.9226 - val_loss: 0.2769 - val_accuracy: 0.9223
Epoch 179/200
375/375 [==============================] - 0s 729us/step - loss: 0.2790 - accuracy: 0.9222 - val_loss: 0.2769 - val_accuracy: 0.9227
Epoch 180/200
375/375 [==============================] - 0s 696us/step - loss: 0.2789 - accuracy: 0.9225 - val_loss: 0.2768 - val_accuracy: 0.9228
Epoch 181/200
375/375 [==============================] - 0s 692us/step - loss: 0.2788 - accuracy: 0.9225 - val_loss: 0.2767 - val_accuracy: 0.9222
Epoch 182/200
375/375 [==============================] - 0s 698us/step - loss: 0.2786 - accuracy: 0.9226 - val_loss: 0.2766 - val_accuracy: 0.9225
Epoch 183/200
375/375 [==============================] - 0s 698us/step - loss: 0.2785 - accuracy: 0.9228 - val_loss: 0.2766 - val_accuracy: 0.9227
Epoch 184/200
375/375 [==============================] - 0s 729us/step - loss: 0.2784 - accuracy: 0.9228 - val_loss: 0.2764 - val_accuracy: 0.9226
Epoch 185/200
375/375 [==============================] - 0s 693us/step - loss: 0.2782 - accuracy: 0.9226 - val_loss: 0.2764 - val_accuracy: 0.9223
Epoch 186/200
375/375 [==============================] - 0s 696us/step - loss: 0.2780 - accuracy: 0.9227 - val_loss: 0.2764 - val_accuracy: 0.9230
Epoch 187/200
375/375 [==============================] - 0s 698us/step - loss: 0.2780 - accuracy: 0.9229 - val_loss: 0.2762 - val_accuracy: 0.9227
Epoch 188/200
375/375 [==============================] - 0s 698us/step - loss: 0.2778 - accuracy: 0.9230 - val_loss: 0.2762 - val_accuracy: 0.9222
Epoch 189/200
375/375 [==============================] - 0s 691us/step - loss: 0.2776 - accuracy: 0.9228 - val_loss: 0.2761 - val_accuracy: 0.9227
Epoch 190/200
375/375 [==============================] - 0s 699us/step - loss: 0.2775 - accuracy: 0.9230 - val_loss: 0.2761 - val_accuracy: 0.9226
Epoch 191/200
375/375 [==============================] - 0s 693us/step - loss: 0.2773 - accuracy: 0.9230 - val_loss: 0.2761 - val_accuracy: 0.9227
Epoch 192/200
375/375 [==============================] - 0s 700us/step - loss: 0.2773 - accuracy: 0.9230 - val_loss: 0.2758 - val_accuracy: 0.9224
Epoch 193/200
375/375 [==============================] - 0s 693us/step - loss: 0.2771 - accuracy: 0.9231 - val_loss: 0.2758 - val_accuracy: 0.9224
Epoch 194/200
375/375 [==============================] - 0s 699us/step - loss: 0.2770 - accuracy: 0.9230 - val_loss: 0.2757 - val_accuracy: 0.9227
Epoch 195/200
375/375 [==============================] - 0s 698us/step - loss: 0.2769 - accuracy: 0.9230 - val_loss: 0.2756 - val_accuracy: 0.9227
Epoch 196/200
375/375 [==============================] - 0s 698us/step - loss: 0.2768 - accuracy: 0.9231 - val_loss: 0.2756 - val_accuracy: 0.9224
Epoch 197/200
375/375 [==============================] - 0s 691us/step - loss: 0.2766 - accuracy: 0.9229 - val_loss: 0.2755 - val_accuracy: 0.9228
Epoch 198/200
375/375 [==============================] - 0s 694us/step - loss: 0.2765 - accuracy: 0.9234 - val_loss: 0.2755 - val_accuracy: 0.9224
Epoch 199/200
375/375 [==============================] - 0s 697us/step - loss: 0.2764 - accuracy: 0.9231 - val_loss: 0.2754 - val_accuracy: 0.9224
Epoch 200/200
375/375 [==============================] - 0s 696us/step - loss: 0.2762 - accuracy: 0.9232 - val_loss: 0.2753 - val_accuracy: 0.9230
313/313 [==============================] - 0s 352us/step - loss: 0.2769 - accuracy: 0.9227

Test score: 0.27694398164749146
Test accuracy: 0.9226999878883362
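The `accuracy` figures in the log above are simply the fraction of correctly classified samples. A minimal sketch in plain Python (no Keras required; the function name `accuracy` is my own, not a Keras API):

```python
def accuracy(y_true, y_pred):
    """Fraction of matching labels -- the same metric Keras
    reports as `accuracy` in its progress bar."""
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true)

# 3 of 4 predictions match the labels
print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # 0.75
```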

# echo $SHELL
/bin/bash

Training time became dramatically faster.

bash
$ docker ps
CONTAINER ID   IMAGE                   COMMAND       CREATED       STATUS       PORTS                                            NAMES
4842d75f325d   continuumio/anaconda3   "/bin/bash"   8 hours ago   Up 8 hours   0.0.0.0:6066->6066/tcp, 0.0.0.0:8880->8880/tcp   festive_antonelli
$ docker commit 4842d75f325d  kaizenjapan/anaconda-tensorflow-m2macOS
invalid reference format: repository name must be lowercase
$ docker commit 4842d75f325d  kaizenjapan/anaconda-tensorflow-m2macos
sha256:ace660cd14b5370d65a5041c860167411557f1216dfe9b1c4d1cde0801e05619
$ docker push kaizenjapan/anaconda-tensoflow-m2macos
Using default tag: latest
The push refers to repository [docker.io/kaizenjapan/anaconda-tensoflow-m2macos]
An image does not exist locally with the tag: kaizenjapan/anaconda-tensoflow-m2macos
$ docker push kaizenjapan/anaconda-tensorflow-m2macos
Using default tag: latest
The push refers to repository [docker.io/kaizenjapan/anaconda-tensorflow-m2macos]
aa2163b9fe7a: Pushed 
8109b71a3b8d: Mounted from continuumio/anaconda3 
6c9ad649ba04: Mounted from continuumio/anaconda3 
latest: digest: sha256:4c98a386ee42a133dc378070fabeb0cb34af62a78b8d892df40b777b323dfaaa size: 956
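As the session above shows, `docker commit` rejects a tag containing uppercase letters ("repository name must be lowercase"), and a typo in the repository name ("tensoflow") makes `docker push` fail because no local image carries that tag. A small shell helper that lowercases the name before tagging avoids the first mistake (the function name `normalize_repo` is my own):

```shell
# Lowercase a Docker repository name so that
# "invalid reference format: repository name must be lowercase" cannot occur.
normalize_repo() {
  printf '%s' "$1" | tr '[:upper:]' '[:lower:]'
}

repo=$(normalize_repo "kaizenjapan/anaconda-tensorflow-m2macOS")
echo "$repo"  # kaizenjapan/anaconda-tensorflow-m2macos
# then: docker commit <container-id> "$repo" && docker push "$repo"
```

The typo problem has no mechanical fix; copying the name printed by `docker image ls` instead of retyping it is the safest habit.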

The image is now saved on Docker Hub.

Related materials

@kazuo_reve 私が効果を確認した「小川メソッド」
https://qiita.com/kazuo_reve/items/a3ea1d9171deeccc04da

@kazuo_reve 新人の方によく展開している有益な情報
https://qiita.com/kazuo_reve/items/d1a3f0ee48e24bba38f1

@kazuo_reve Vモデルについて勘違いしていたと思ったこと
https://qiita.com/kazuo_reve/items/46fddb094563bd9b2e1e

List of my own articles

プログラマが知っていると良い「公序良俗」
https://qiita.com/kaizen_nagoya/items/9fe7c0dfac2fbd77a945

逆も真:社会人が最初に確かめるとよいこと。OSEK(69)、Ethernet(59)
https://qiita.com/kaizen_nagoya/items/39afe4a728a31b903ddc

「何を」よりも「誰を」。10年後のために今見習いたい人たち
https://qiita.com/kaizen_nagoya/items/8045978b16eb49d572b2

Qiitaの記事に3段階または5段階で到達するための方法
https://qiita.com/kaizen_nagoya/items/6e9298296852325adc5e

物理記事 上位100
https://qiita.com/kaizen_nagoya/items/66e90fe31fbe3facc6ff

量子(0) 計算機, 量子力学
https://qiita.com/kaizen_nagoya/items/1cd954cb0eed92879fd4

数学関連記事100
https://qiita.com/kaizen_nagoya/items/d8dadb49a6397e854c6d

統計(0)一覧
https://qiita.com/kaizen_nagoya/items/80d3b221807e53e88aba

図(0) state, sequence and timing. UML and お絵描き
https://qiita.com/kaizen_nagoya/items/60440a882146aeee9e8f

品質一覧
https://qiita.com/kaizen_nagoya/items/2b99b8e9db6d94b2e971

言語・文学記事 100
https://qiita.com/kaizen_nagoya/items/42d58d5ef7fb53c407d6

医工連携関連記事一覧
https://qiita.com/kaizen_nagoya/items/6ab51c12ba51bc260a82

自動車 記事 100
https://qiita.com/kaizen_nagoya/items/f7f0b9ab36569ad409c5

通信記事100
https://qiita.com/kaizen_nagoya/items/1d67de5e1cd207b05ef7

日本語(0)一欄
https://qiita.com/kaizen_nagoya/items/7498dcfa3a9ba7fd1e68

英語(0) 一覧
https://qiita.com/kaizen_nagoya/items/680e3f5cbf9430486c7d

転職(0)一覧
https://qiita.com/kaizen_nagoya/items/f77520d378d33451d6fe

仮説(0)一覧(目標100現在40)
https://qiita.com/kaizen_nagoya/items/f000506fe1837b3590df

音楽 一覧(0)
https://qiita.com/kaizen_nagoya/items/b6e5f42bbfe3bbe40f5d

「@kazuo_reve 新人の方によく展開している有益な情報」確認一覧
https://qiita.com/kaizen_nagoya/items/b9380888d1e5a042646b

Qiita(0)Qiita関連記事一覧(自分)
https://qiita.com/kaizen_nagoya/items/58db5fbf036b28e9dfa6

鉄道(0)鉄道のシステム考察はてっちゃんがてつだってくれる
https://qiita.com/kaizen_nagoya/items/26bda595f341a27901a0

安全(0)安全工学シンポジウムに向けて: 21
https://qiita.com/kaizen_nagoya/items/c5d78f3def8195cb2409

一覧の一覧( The directory of directories of mine.) Qiita(100)
https://qiita.com/kaizen_nagoya/items/7eb0e006543886138f39

Ethernet 記事一覧 Ethernet(0)
https://qiita.com/kaizen_nagoya/items/88d35e99f74aefc98794

Wireshark 一覧 wireshark(0)、Ethernet(48)
https://qiita.com/kaizen_nagoya/items/fbed841f61875c4731d0

線網(Wi-Fi)空中線(antenna)(0) 記事一覧(118/300目標)
https://qiita.com/kaizen_nagoya/items/5e5464ac2b24bd4cd001

OSEK OS設計の基礎 OSEK(100)
https://qiita.com/kaizen_nagoya/items/7528a22a14242d2d58a3

Error一覧 error(0)
https://qiita.com/kaizen_nagoya/items/48b6cbc8d68eae2c42b8

C++ Support(0)
https://qiita.com/kaizen_nagoya/items/8720d26f762369a80514

Coding(0) Rules, C, Secure, MISRA and so on
https://qiita.com/kaizen_nagoya/items/400725644a8a0e90fbb0

coding (101) 一覧を作成し始めた。omake:最近のQiitaで表示しない5つの事象
https://qiita.com/kaizen_nagoya/items/20667f09f19598aedb68

プログラマによる、プログラマのための、統計(0)と確率のプログラミングとその後
https://qiita.com/kaizen_nagoya/items/6e9897eb641268766909

なぜdockerで機械学習するか 書籍・ソース一覧作成中 (目標100)
https://qiita.com/kaizen_nagoya/items/ddd12477544bf5ba85e2

言語処理100本ノックをdockerで。python覚えるのに最適。:10+12
https://qiita.com/kaizen_nagoya/items/7e7eb7c543e0c18438c4

プログラムちょい替え(0)一覧:4件
https://qiita.com/kaizen_nagoya/items/296d87ef4bfd516bc394

Python(0)記事をまとめたい。
https://qiita.com/kaizen_nagoya/items/088c57d70ab6904ebb53

官公庁・学校・公的団体(NPOを含む)システムの課題、官(0)
https://qiita.com/kaizen_nagoya/items/04ee6eaf7ec13d3af4c3

「はじめての」シリーズ  ベクタージャパン 
https://qiita.com/kaizen_nagoya/items/2e41634f6e21a3cf74eb

AUTOSAR(0)Qiita記事一覧, OSEK(75)
https://qiita.com/kaizen_nagoya/items/89c07961b59a8754c869

プログラマが知っていると良い「公序良俗」
https://qiita.com/kaizen_nagoya/items/9fe7c0dfac2fbd77a945

LaTeX(0) 一覧 
https://qiita.com/kaizen_nagoya/items/e3f7dafacab58c499792

自動制御、制御工学一覧(0)
https://qiita.com/kaizen_nagoya/items/7767a4e19a6ae1479e6b

Rust(0) 一覧 
https://qiita.com/kaizen_nagoya/items/5e8bb080ba6ca0281927

100以上いいねをいただいた記事16選
https://qiita.com/kaizen_nagoya/items/f8d958d9084ffbd15d2a

小川清最終講義、最終講義(再)計画, Ethernet(100) 英語(100) 安全(100)
https://qiita.com/kaizen_nagoya/items/e2df642e3951e35e6a53
<This article is a personal opinion based on my own past experience. It has nothing to do with the organization or business I currently belong to.>

Document history

ver. 0.10 first draft, 2023-10-01

Thank you very much for reading to the end.

Please press the like icon 💚 and follow me.
