
openframeworksでlibtorchを使うための設定(Windows / Mac)

Posted at 2019-11-11

(I'll write this up properly later.)

Configuration for using LibTorch with openFrameworks.
https://pytorch.org/tutorials/advanced/cpp_export.html
This walks through the tutorial above inside openFrameworks.

Environment

Windows (GPU)
Visual Studio 2015
openFrameworks: 0.10.1
CUDA: 10.1
cuDNN: 7.6.4

Mac (CPU)
Xcode: 11.2
openFrameworks: 0.10.1

Preparation on the PyTorch side

Convert torchvision's MobileNetV2 and ResNet50 models to TorchScript.

convert_torchvision.py
import torch
import torchvision

model1 = torchvision.models.resnet50(pretrained=True)
model1.eval()
print(model1)
traced1 = torch.jit.trace(model1, (torch.rand(1, 3, 224, 224)))
traced1.save('./resnet50.pt')

model2 = torchvision.models.mobilenet_v2(pretrained=True)
model2.eval()
print(model2)
traced2 = torch.jit.trace(model2, (torch.rand(1, 3, 224, 224)))
traced2.save('./mobilenetv2.pt')
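Before moving to the C++ side, it is worth confirming that the traced file round-trips: load it back with torch.jit.load and check that it matches the eager model. A minimal sketch of that check (not from the original article), using a tiny stand-in network so no pretrained weights need to be downloaded; the same pattern applies to the resnet50/mobilenet_v2 traces above:

```python
import os
import tempfile

import torch
import torch.nn as nn

# Tiny stand-in model; swap in torchvision.models.mobilenet_v2(pretrained=True)
# to check the real trace.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 5),
)
model.eval()

x = torch.rand(1, 3, 224, 224)
traced = torch.jit.trace(model, (x,))

# Save and reload the TorchScript file exactly as the C++ side will.
path = os.path.join(tempfile.mkdtemp(), "tiny.pt")
traced.save(path)
reloaded = torch.jit.load(path)

with torch.no_grad():
    assert torch.allclose(model(x), reloaded(x), atol=1e-6)
print("traced and reloaded outputs match")
```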

openFrameworks project settings

Modify the settings of a project generated with projectGenerator.

Windows

A property sheet bundling all of the settings below is available here.
Adding it via View -> Other Windows -> Property Manager completes the (LibTorch + GPU) setup.

User macros
LIBTORCH_ROOT = C:\Users\UserName\Documents\libtorch
Replace UserName with your own user name.

C/C++ -> Additional Include Directories
$(LIBTORCH_ROOT)\Release\libtorch\include
$(LIBTORCH_ROOT)\Release\libtorch\include\torch\csrc\api\include

Linker -> Input -> Additional Dependencies
$(LIBTORCH_ROOT)\Release\libtorch\lib\c10.lib
$(LIBTORCH_ROOT)\Release\libtorch\lib\caffe2_nvrtc.lib
$(LIBTORCH_ROOT)\Release\libtorch\lib\c10_cuda.lib
$(LIBTORCH_ROOT)\Release\libtorch\lib\torch.lib

Move the *.dll files in libtorch\lib into project\bin.
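The DLL step can be done from a command prompt; a sketch, assuming the LIBTORCH_ROOT location above (the destination path is a placeholder — point it at your own project's bin folder):

```shell
copy "%USERPROFILE%\Documents\libtorch\Release\libtorch\lib\*.dll" "C:\path\to\project\bin\"
```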

Mac

Target
Architectures -> Architectures
$(ARCHS_STANDARD_64_BIT)

Project.xcconfig
LIBTORCH_ROOT = /Users/UserName/Documents/libtorch
Replace UserName with your own user name.

Linking -> Other Linker Flags
-L/usr/local/opt/opencv@3/lib
$(LIBTORCH_ROOT)/lib/libc10.dylib
$(LIBTORCH_ROOT)/lib/libtorch.dylib

Linking -> Runpath Search Paths and Search Paths -> Library Search Paths
$(LIBTORCH_ROOT)/lib

Search Paths -> System Header Search Paths
$(LIBTORCH_ROOT)/include
$(LIBTORCH_ROOT)/include/torch/csrc/api/include

Rewrite ofApp's setup

ofApp.cpp
#include <torch/script.h>

void ofApp::setup(){
    torch::jit::script::Module module;
    // ofFilePath's path helpers are static, so no instance is needed
    std::cout << std::filesystem::exists(ofFilePath::getAbsolutePath("./mobilenetv2.pt")) << "\n";
    ofLogNotice() << ofFilePath::getCurrentWorkingDirectory();
    try {
      // Deserialize the ScriptModule from a file using torch::jit::load().
      module = torch::jit::load(ofFilePath::getAbsolutePath("./mobilenetv2.pt"));
    }
    catch (const c10::Error& e) {
      std::cerr << "error loading the model\n";
      return;  // setup() returns void, so `return -1;` would not compile
    }

    std::cout << "ok\n";

    // Create a vector of inputs.
    std::vector<torch::jit::IValue> inputs;
    inputs.push_back(torch::ones({1, 3, 224, 224}));

    // Execute the model and turn its output into a tensor.
    at::Tensor output = module.forward(inputs).toTensor();
    std::cout << output.slice(/*dim=*/1, /*start=*/0, /*end=*/5) << '\n';
}