Running TensorFlow/Keras on a RasPi One


I installed TensorFlow and Keras on a RasPi One that I had not used in a while.
Model: Raspberry Pi 1 Model B+
OS: Linux raspberrypi 4.9.59+ #1047 Sun Oct 29 11:47:10 GMT 2017 armv6l GNU/Linux (NOOBS_v2_4_4.zip)
Python: 2.7.13 (the version preinstalled with the system)

#Reinstalling the RasPi
The board had sat unused for close to two years, so I started over from the OS.
Following The initial Raspberry Pi setup without monitor, I installed without a display or keyboard, using a USB-serial adapter cable (plus a laptop). For the USB-serial adapter cable, I bought this one.
Note 1) In NOOBS_v2_4_4.zip there is no flavours.json inside /mnt/os/Raspbian, so nothing needs to be done about it.
Note 2) The contents of /mnt/recovery.cmdline differ from the referenced article, but in any case it is enough to put runinstaller at the start and silentinstall at the end of the line (see the sketch after these notes).
Note 3) wpasupplicant and wireless-tools are already installed, so configure Wi-Fi following the page below (a minimal example also follows after these notes):
https://www.raspberrypi.org/documentation/configuration/wireless/wireless-cli.md
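A couple of minimal sketches for Notes 2 and 3; the values are placeholders, not the exact contents of my files. For Note 2, the single line in /mnt/recovery.cmdline should end up looking roughly like:

runinstaller <whatever options were already there> silentinstall

For Note 3, the linked page boils down to appending a network block to /etc/wpa_supplicant/wpa_supplicant.conf and then rebooting (or running sudo wpa_cli reconfigure); ssid and psk below are placeholders:

network={
    ssid="your_ssid"
    psk="your_passphrase"
}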

#Installing TensorFlow

I started the install following RasPiでKeras/TensorFlowを動かす, but probably because of differences in environment (a RasPi One is underpowered these days) I hit a few snags along the way, so I am leaving these notes.

As the article above notes, a plain install never finishes compiling, so following Cross-compiling TensorFlow for the Raspberry Pi:

sudo apt-get install libblas-dev liblapack-dev python-dev \
libatlas-base-dev gfortran python-setuptools
sudo pip install \
http://ci.tensorflow.org/view/Nightly/job/nightly-pi-zero/lastSuccessfulBuild/artifact/output-artifacts/tensorflow-1.4.0-cp27-none-any.whl

This takes about 10 minutes.
Note) The name of the install file may have changed, so check the latest file at http://ci.tensorflow.org/view/Nightly/job/nightly-pi-zero/lastSuccessfulBuild/artifact/output-artifacts/
and substitute its name accordingly; otherwise you will get a Not Found url error.
Since pip2 was not available, I installed with pip.
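As a quick sanity check that the wheel actually went in (tf.__version__ is the standard version attribute, nothing specific to this setup):

python -c "import tensorflow as tf; print(tf.__version__)"

This should print 1.4.0 for the wheel above.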

For example, running the logistic regression sample code from 詳解 ディープラーニング ~TensorFlow・Kerasによる時系列データ処理~ produces various warnings, but the model trains just fine.

pi@raspberrypi:~ $ python
Python 2.7.13 (default, Jan 19 2017, 14:48:08) 
[GCC 6.3.0 20170124] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
>>> import numpy as np
>>> tf.set_random_seed(0)
>>> w = tf.Variable(tf.zeros([2, 1]))
2017-11-11 01:51:50.193050: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "ParallelInterleaveDataset" device_type: "CPU"') for unknown op: ParallelInterleaveDataset
2017-11-11 01:51:50.195187: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "DenseToSparseBatchDataset" device_type: "CPU"') for unknown op: DenseToSparseBatchDataset
2017-11-11 01:51:50.197338: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "GroupByWindowDataset" device_type: "CPU"') for unknown op: GroupByWindowDataset
2017-11-11 01:51:50.200289: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "IgnoreErrorsDataset" device_type: "CPU"') for unknown op: IgnoreErrorsDataset
2017-11-11 01:51:50.202929: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "DatasetToSingleElement" device_type: "CPU"') for unknown op: DatasetToSingleElement
2017-11-11 01:51:50.204565: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "SerializeIterator" device_type: "CPU"') for unknown op: SerializeIterator
2017-11-11 01:51:50.205781: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "DeserializeIterator" device_type: "CPU"') for unknown op: DeserializeIterator
2017-11-11 01:51:50.206986: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "MapAndBatchDataset" device_type: "CPU"') for unknown op: MapAndBatchDataset
2017-11-11 01:51:50.208620: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "SqlDataset" device_type: "CPU"') for unknown op: SqlDataset
2017-11-11 01:51:50.215151: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "ScanDataset" device_type: "CPU"') for unknown op: ScanDataset
>>> b = tf.Variable(tf.zeros([1]))
>>> x = tf.placeholder(tf.float32, shape=[None, 2])
>>> t = tf.placeholder(tf.float32, shape=[None, 1])
>>> y = tf.nn.sigmoid(tf.matmul(x, w) + b)
>>> cross_entropy = - tf.reduce_sum(t * tf.log(y) + (1 - t) * tf.log(1 - y))
>>> train_step = tf.train.GradientDescentOptimizer(0.1).minimize(cross_entropy)
>>> correct_prediction = tf.equal(tf.to_float(tf.greater(y, 0.5)), t)
>>> X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
>>> Y = np.array([[0], [1], [1], [1]])
>>> init = tf.global_variables_initializer()
>>> sess = tf.Session()
>>> sess.run(init)
>>> for epoch in range(200):
...     sess.run(train_step, feed_dict={
...         x: X,
...         t: Y
...     })
... 
2017-11-11 01:51:58.221900: W tensorflow/core/grappler/utils.cc:48] Node MatMul_fused is not in the graph.
2017-11-11 01:51:58.229164: W tensorflow/core/grappler/utils.cc:48] Node gradients/MatMul_grad/MatMul_1_fused is not in the graph.
2017-11-11 01:51:58.230564: W tensorflow/core/grappler/utils.cc:48] Node gradients/MatMul_grad/MatMul_fused is not in the graph.
>>> classified = correct_prediction.eval(session=sess, feed_dict={
...     x: X,
...     t: Y
... })
2017-11-11 01:52:01.650931: W tensorflow/core/grappler/utils.cc:48] Node MatMul_fused is not in the graph.
>>> prob = y.eval(session=sess, feed_dict={
...     x: X
... })
2017-11-11 01:52:01.757437: W tensorflow/core/grappler/utils.cc:48] Node MatMul_fused is not in the graph.
>>> print('classified:')
classified:
>>> print(classified)
[[ True]
 [ True]
 [ True]
 [ True]]
>>> print()
()
>>> print('output probability:')
output probability:
>>> print(prob)
[[ 0.22355042]
 [ 0.91425949]
 [ 0.91425949]
 [ 0.99747413]]
>>> 

#Installing Keras
If you install Keras with sudo pip install keras as described in Keras Documentation / Installation, the scipy build never finishes, so install h5py and scipy with apt-get first.

sudo apt-get install python-h5py
sudo apt-get install python-scipy

With that done, Keras installs in a few minutes.

sudo pip install keras

Note) If you try the Keras author's samples below, some of them will complain unless you install Keras version 2.0.0.
Reference) https://github.com/rcmalli/keras-squeezenet/issues/13

sudo pip install keras==2.0.0

Let's try Examples / Classify images from the Keras author's Trained image classification models for Keras.

git clone https://github.com/fchollet/deep-learning-models
cd deep-learning-models/

Note) The repository above has since been relocated, and its functions appear to have been folded into the Keras library itself, so the git clone may not even be necessary (a rough sketch of that route follows).
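As a sketch of that route (hedged, since I did not run it in the session below and the exact module layout depends on the Keras version), the same classification can be written against keras.applications instead of the cloned repo, along these lines:

from keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from keras.preprocessing import image
import numpy as np

model = ResNet50(weights='imagenet')  # downloads the ImageNet weights on first use
img = image.load_img('African_Bush_Elephant.jpg', target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))
print(decode_predictions(model.predict(x)))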

Fetch an image for img_path below from somewhere yourself and put it in the current directory.

pi@raspberrypi:~/deep-learning-models $ python
Python 2.7.13 (default, Jan 19 2017, 14:48:08) 
[GCC 6.3.0 20170124] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import numpy as np
>>> 
>>> from resnet50 import ResNet50
Using TensorFlow backend.
>>> from keras.preprocessing import image
>>> from imagenet_utils import preprocess_input, decode_predictions
>>> 
>>> model = ResNet50(weights='imagenet')
2017-11-11 03:18:06.604538: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "ParallelInterleaveDataset" device_type: "CPU"') for unknown op: ParallelInterleaveDataset
2017-11-11 03:18:06.606494: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "DenseToSparseBatchDataset" device_type: "CPU"') for unknown op: DenseToSparseBatchDataset
2017-11-11 03:18:06.608764: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "GroupByWindowDataset" device_type: "CPU"') for unknown op: GroupByWindowDataset
2017-11-11 03:18:06.611166: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "IgnoreErrorsDataset" device_type: "CPU"') for unknown op: IgnoreErrorsDataset
2017-11-11 03:18:06.613975: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "DatasetToSingleElement" device_type: "CPU"') for unknown op: DatasetToSingleElement
2017-11-11 03:18:06.615980: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "SerializeIterator" device_type: "CPU"') for unknown op: SerializeIterator
2017-11-11 03:18:06.617237: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "DeserializeIterator" device_type: "CPU"') for unknown op: DeserializeIterator
2017-11-11 03:18:06.618451: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "MapAndBatchDataset" device_type: "CPU"') for unknown op: MapAndBatchDataset
2017-11-11 03:18:06.620052: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "SqlDataset" device_type: "CPU"') for unknown op: SqlDataset
2017-11-11 03:18:06.626449: E tensorflow/core/framework/op_kernel.cc:1142] OpKernel ('op: "ScanDataset" device_type: "CPU"') for unknown op: ScanDataset
WARNING:tensorflow:From /usr/local/lib/python2.7/dist-packages/tensorflow/python/ops/nn_impl.py:664: calling reduce_mean (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
WARNING:tensorflow:From /usr/local/lib/python2.7/dist-packages/keras/backend/tensorflow_backend.py:1062: calling reduce_prod (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
>>> 
>>> img_path = 'African_Bush_Elephant.jpg'
>>> img = image.load_img(img_path, target_size=(224, 224))
>>> x = image.img_to_array(img)
>>> x = np.expand_dims(x, axis=0)
>>> x = preprocess_input(x)
>>> 
>>> preds = model.predict(x)
2017-11-11 03:25:55.292233: W tensorflow/core/grappler/utils.cc:48] Node fc1000/MatMul_fused is not in the graph.
>>> print('Predicted:', decode_predictions(preds))
('Predicted:', [[(u'n02504458', u'African_elephant', 0.91709477), (u'n01871265', u'tusker', 0.041888889), (u'n02504013', u'Indian_elephant', 0.035944905), (u'n03743016', u'megalith', 0.0016836942), (u'n01704323', u'triceratops', 0.0011915577)]])

Note 1) The sample on the site above is missing import numpy as np.
Note 2) I hit a Segmentation fault when running model = ResNet50(weights='imagenet'), but expanding the swap area as described in RasPiでKeras/TensorFlowを動かす fixed it (a typical set of commands follows).
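I will not reproduce the referenced article here, but on Raspbian "expanding the swap area" typically means raising CONF_SWAPSIZE in /etc/dphys-swapfile and recreating the swap file; 1024 (MB) below is just an example size:

sudo sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=1024/' /etc/dphys-swapfile
sudo dphys-swapfile swapoff
sudo dphys-swapfile setup
sudo dphys-swapfile swapon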

The elephant here was, as you would expect, recognized as an African elephant with roughly 90% confidence.
This one, however, was recognized as a pencil box...

('Predicted:', [[(u'n03908618', u'pencil_box', 0.44221291), (u'n03291819', u'envelope', 0.15529086), (u'n07248320', u'book_jacket', 0.063755065), (u'n03485794', u'handkerchief', 0.053830493), (u'n06596364', u'comic_book', 0.037152626)]])