Setting up and verifying GPU REST Engine

Posted at 2016-05-09

The colors of the images look odd in Chrome, so please view this page in Safari or Firefox.

Overall flow

Prerequisites
=================
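The steps below assume that Docker, nvidia-docker, and the NVIDIA GPU driver are already installed on the machine. As a minimal sanity check (a sketch, not a full prerequisite list), the following commands should all succeed:

check-prerequisites
nvidia-smi                     # the GPU and driver are visible
sudo docker version            # Docker is installed
command -v nvidia-docker       # nvidia-docker is on the PATH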
1. Download GRE (GPU REST Engine)
===================================
Download
git clone https://github.com/NVIDIA/gpu-rest-engine ${HOME}/gpu-rest-engine
2. Building and starting the inference server
=====================

2.1 Building the inference server

The build takes around 15 minutes.

Move to the cloned directory
cd ${HOME}/gpu-rest-engine
Build the inference server
sudo docker build -t inference_server -f Dockerfile.inference_server .
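
To confirm that the image was actually built before moving on, listing the local Docker images should show the inference_server tag (a quick check of my own):

check-image
sudo docker images | grep inference_server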

2.2 Starting the inference server

Start the inference server
sudo nvidia-docker run --name=server --net=host --rm inference_server
result-example
2016/05/06 04:30:51 Initializing Caffe classifiers
2016/05/06 04:30:54 Adding REST endpoint /api/classify
2016/05/06 04:30:54 Starting server listening on :8000
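
The command above keeps the server in the foreground and, because of --rm, removes the container once it exits. If you would rather get your terminal back, a detached run like the following should also work (a sketch using standard Docker options):

run-in-background (sketch)
sudo nvidia-docker run -d --name=server --net=host inference_server   # run detached
sudo docker logs -f server                           # follow the log until "Starting server listening on :8000"
sudo docker stop server && sudo docker rm server     # stop and clean up when finished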
3. Verifying operation
==============

In this check, we upload an image from the same machine the inference server is running on and have it guess which animal appears in the image.

Classify the animal in the sample image images/1.jpg

(image: images/1.jpg)

Set the path to the image
IMAGE="${HOME}/gpu-rest-engine/images/1.jpg"
Set the endpoint
ENDPOINT="http://127.0.0.1:8000/api/classify"
Check the variables
cat << ETX

IMAGE_NAME : ${IMAGE} 
END_POINT : ${ENDPOINT}

ETX
Call the inference server API
curl -XPOST --data-binary @${IMAGE} ${ENDPOINT} | jq .
result
[
  {
    "label": "n02328150 Angora, Angora rabbit",
    "confidence": 0.9998
  },
  {
    "label": "n02325366 wood rabbit, cottontail, cottontail rabbit",
    "confidence": 0.0001
  },
  {
    "label": "n02326432 hare",
    "confidence": 0.0001
  },
  {
    "label": "n02085936 Maltese dog, Maltese terrier, Maltese",
    "confidence": 0
  },
  {
    "label": "n02342885 hamster",
    "confidence": 0
  }
]
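
The response comes back as a JSON array that appears to be sorted by confidence, so jq makes it easy to pull out just the top prediction (a small convenience sketch):

top-prediction (sketch)
curl -s -XPOST --data-binary @${IMAGE} ${ENDPOINT} | jq -r '.[0] | "\(.label) (\(.confidence))"'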

Classify an image from the internet

A rabbit

Download the image
wget http://sozaing.com/wp-content/uploads/IMG_29451-540x360.jpg -P ${HOME}/gpu-rest-engine/images
Set the path to the image
IMAGE="${HOME}/gpu-rest-engine/images/IMG_29451-540x360.jpg"
Check the variables
cat << ETX

IMAGE_NAME : ${IMAGE} 
END_POINT : ${ENDPOINT}

ETX
Call the inference server API
curl -XPOST --data-binary @${IMAGE} ${ENDPOINT} | jq .
result
[
  {
    "label": "n02325366 wood rabbit, cottontail, cottontail rabbit",
    "confidence": 0.6424
  },
  {
    "label": "n02326432 hare",
    "confidence": 0.1461
  },
  {
    "label": "n02095889 Sealyham terrier, Sealyham",
    "confidence": 0.0929
  },
  {
    "label": "n02437616 llama",
    "confidence": 0.0161
  },
  {
    "label": "n02328150 Angora, Angora rabbit",
    "confidence": 0.0149
  }
]
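
With several test images saved under images/, the same request can be wrapped in a small loop; this sketch assumes ENDPOINT is still set from the steps above and prints only the top label for each file:

classify-all-images (sketch)
for IMAGE in ${HOME}/gpu-rest-engine/images/*.jpg; do
  echo "=== ${IMAGE}"
  curl -s -XPOST --data-binary @${IMAGE} ${ENDPOINT} | jq -r '.[0].label'
done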
4. Benchmarking
==========================

4.1 Building the test docker container

Move to the cloned directory
cd ${HOME}/gpu-rest-engine
command
sudo docker build -t inference_client -f Dockerfile.inference_client .

4.2 Running the test

benchmark-test
sudo docker run -e CONCURRENCY=8 -e REQUESTS=20000 --net=host inference_client
result-example
Summary:
  Total:	153.6530 secs
  Slowest:	0.0992 secs
  Fastest:	0.0198 secs
  Average:	0.0614 secs
  Requests/sec:	130.1634
  Total data:	6880000 bytes
  Size/request:	344 bytes

Status code distribution:
  [200]	20000 responses

Response time histogram:
  0.020 [1]		|
  0.028 [0]		|
  0.036 [2]		|
  0.044 [5]		|
  0.052 [2]		|
  0.060 [548]	|∎
  0.067 [19139]	|∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎∎
  0.075 [297]	|
  0.083 [3]		|
  0.091 [2]		|
  0.099 [1]		|

Latency distribution:
  10% in 0.0610 secs
  25% in 0.0612 secs
  50% in 0.0614 secs
  75% in 0.0616 secs
  90% in 0.0619 secs
  95% in 0.0623 secs
  99% in 0.0691 secs
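
CONCURRENCY and REQUESTS are just environment variables passed to the client container, so it is easy to sweep a few concurrency levels and compare the Requests/sec numbers (a sketch reusing the same inference_client image, with a smaller request count to keep each run short):

benchmark-sweep (sketch)
for C in 1 2 4 8 16; do
  echo "=== CONCURRENCY=${C}"
  sudo docker run -e CONCURRENCY=${C} -e REQUESTS=5000 --net=host inference_client
done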

The information in this Qiita article is something I put together on my own time and has nothing to do with NVIDIA. Try it at your own risk, and be careful not to leave instances and other resources running.
