
Building an MLP with ONNX

I can now create an ONNX file for an MLP using the Python API.

onnx.checker.check_model does not seem to verify dimension information: even after changing the shape values arbitrarily, the check still passes. That is rather suspicious, but for now I have something that looks about right.

Summary so far

My first ONNX
https://qiita.com/natsutan/items/6670bd518005152f654a

Working around "ONNX Nodes in a graph must be topologically sorted"
https://qiita.com/natsutan/items/9233a65db4cc90fb4c3a

Setting an initializer value for an ONNX ValueInfoProto
https://qiita.com/natsutan/items/e77e977e26c75c9159e4

Source

I was quite surprised that nodes are connected to each other by nothing more than names (strings).
I finally understood what my colleague meant when they kept mentioning shape inference.

import numpy

import onnx
from onnx import TensorProto
from onnx import helper, numpy_helper

X = helper.make_tensor_value_info('X', TensorProto.FLOAT, [28, 28])
Y = helper.make_tensor_value_info('Y', TensorProto.FLOAT, [10])
# Target shape for the Reshape node; its value is supplied by the initializer below
input_shape = helper.make_tensor_value_info('input_shape', TensorProto.INT64, [1])

# Make a TensorProto from a numpy array
input_shape_init = numpy_helper.from_array(numpy.array([784], dtype=numpy.int64))
input_shape_init.name = 'input_shape'

# float32 matches TensorProto.FLOAT (numpy.float is deprecated and was removed in NumPy 1.20)
W0 = numpy_helper.from_array(numpy.zeros((784, 512), dtype=numpy.float32))
B0 = numpy_helper.from_array(numpy.zeros(512, dtype=numpy.float32))
W0.name = 'W0'
B0.name = 'B0'

W1 = numpy_helper.from_array(numpy.zeros((512, 512), dtype=numpy.float32))
B1 = numpy_helper.from_array(numpy.zeros(512, dtype=numpy.float32))
W1.name = 'W1'
B1.name = 'B1'

W2 = numpy_helper.from_array(numpy.zeros((512, 10), dtype=numpy.float32))
B2 = numpy_helper.from_array(numpy.zeros(10, dtype=numpy.float32))
W2.name = 'W2'
B2.name = 'B2'


flat_op = helper.make_node(
    'Reshape',
    inputs=['X', 'input_shape'],
    outputs=['reshape_out'],
    name="reshape_node"
)

# Layer 0: 784 -> 512
matmul0_op = helper.make_node(
    'MatMul',
    inputs=['reshape_out', 'W0'],
    outputs=['matmul0_out']
)

add0_op = helper.make_node(
    'Add',
    inputs=['matmul0_out', 'B0'],
    outputs=['add0_out']
)

relu0_op = helper.make_node(
    'Relu',
    inputs=['add0_out'],
    outputs=['relu0_out']
)

# Layer 1: 512 -> 512
matmul1_op = helper.make_node(
    'MatMul',
    inputs=['relu0_out', 'W1'],
    outputs=['matmul1_out']
)

add1_op = helper.make_node(
    'Add',
    inputs=['matmul1_out', 'B1'],
    outputs=['add1_out']
)

relu1_op = helper.make_node(
    'Relu',
    inputs=['add1_out'],
    outputs=['relu1_out']
)

# Layer 2: 512 -> 10
matmul2_op = helper.make_node(
    'MatMul',
    inputs=['relu1_out', 'W2'],
    outputs=['matmul2_out']
)

add2_op = helper.make_node(
    'Add',
    inputs=['matmul2_out', 'B2'],
    outputs=['add2_out']
)

softmax_op = helper.make_node(
    'Softmax',
    axis=-1,
    inputs=['add2_out'],
    outputs=['Y']
)

graph_def = helper.make_graph(
    [flat_op, matmul0_op, add0_op, relu0_op, matmul1_op, add1_op, relu1_op, matmul2_op, add2_op,
     softmax_op],
    'my-mlp',
    [X],
    [Y],
    initializer = [input_shape_init, W0, B0, W1, B1, W2, B2]
)

model_def = helper.make_model(
    graph_def,
    producer_name='natsutan'
)

onnx.checker.check_model(model_def)  # validate before saving
onnx.save(model_def, 'onnx/my_mlp.onnx')
print('The model is checked!')
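To convince myself the dataflow above is what I intended, here is a NumPy sketch of the same computation (a hand-written re-implementation for sanity-checking, not part of the ONNX API). With the all-zero weights above, every logit is zero and the Softmax output is uniform.

```python
import numpy as np

def mlp_forward(x, W0, B0, W1, B1, W2, B2):
    h = x.reshape(784)                  # Reshape with input_shape = [784]
    h = np.maximum(h @ W0 + B0, 0.0)    # MatMul + Add + Relu (layer 0)
    h = np.maximum(h @ W1 + B1, 0.0)    # MatMul + Add + Relu (layer 1)
    logits = h @ W2 + B2                # MatMul + Add (layer 2)
    e = np.exp(logits - logits.max())   # Softmax over the last axis
    return e / e.sum()

x = np.random.rand(28, 28).astype(np.float32)
W0 = np.zeros((784, 512), dtype=np.float32); B0 = np.zeros(512, dtype=np.float32)
W1 = np.zeros((512, 512), dtype=np.float32); B1 = np.zeros(512, dtype=np.float32)
W2 = np.zeros((512, 10), dtype=np.float32); B2 = np.zeros(10, dtype=np.float32)

y = mlp_forward(x, W0, B0, W1, B1, W2, B2)
print(y.shape)   # (10,)
print(y.sum())   # ~1.0: probabilities over 10 classes
```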

Visualization

Output from netron:

(image: my_mlp.onnx.png)
