
Getting Started with Chainer: Implementing And/Or/Xor

Posted at 2016-07-11

Introduction

The Chainer tutorial walks through an MNIST implementation.

Introduction to Chainer / Chainer

MNIST is said to be the framework's 'hello world', but to get a feel for how things work with an even simpler example, there is an article that implements logical operations.

chainerでニューラルネットを学んでみるよ(chainerでニューラルネット2) / 人工言語処理入門

However, that example appears to use an older Chainer API, so as a learning exercise I rewrote it in a style closer to the Chainer tutorial.

With a Single Link

This confirms that And/Or work correctly, but Xor, which cannot be represented by a single Link, fails.

The network takes two inputs and produces two outputs. It acts as a classifier whose two outputs are real-valued scores for "False" and "True", respectively.

## Network definition
class NN2x2_1Link(Chain):
    def __init__(self):
        super(NN2x2_1Link, self).__init__(
            l = F.Linear(2, 2),
        )
    def __call__(self, x):
        h = self.l(x)
        return h

For training data we use the four rows of the truth table as-is, but four samples are far too few for a neural network to learn from.
So one epoch simply feeds those same four rows augment_size times in a row.

## Runs learning loop
def learning_looper(model, optimizer, inputs, outputs, epoch_size):
    augment_size = 100
    for epoch in range(epoch_size):
        print('epoch %d' % epoch)
        for a in range(augment_size):
            for i in range(len(inputs)):
                x = Variable(inputs[i].reshape(1,2).astype(np.float32), volatile=False)
                t = Variable(outputs[i].astype(np.int32), volatile=False)
                optimizer.update(model, x, t)
        summarize(model, optimizer, inputs, outputs)

The training results…

And/Or were learned correctly.

<<AND: After Learning>>
  :
epoch 4
model says:
  0 & 0 = 0 (zero:1.355372 one:-1.355372)
  0 & 1 = 0 (zero:0.276090 one:-0.623873)
  1 & 0 = 0 (zero:0.292820 one:-1.014451)
  1 & 1 = 1 (zero:-0.786463 one:-0.282952)
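Each log line shows the two raw output scores; the predicted class is simply their argmax, mirroring the `np.argmax(y.data)` call in `summarize`. For example, with the scores from the first AND line above:

```python
import numpy as np

# Scores from the first AND log line above: (zero: 1.355372, one: -1.355372)
scores = np.array([1.355372, -1.355372])
print(np.argmax(scores))  # → 0, i.e. the model predicts False
```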

<<OR: After Learning>>
  :
epoch 4
model says:
  0 & 0 = 0 (zero:0.210631 one:-0.210631)
  0 & 1 = 1 (zero:-0.965156 one:0.735971)
  1 & 0 = 1 (zero:-0.337314 one:1.656959)
  1 & 1 = 1 (zero:-1.513100 one:2.603561)

However, exactly as expected, Xor was not learned.

<<XOR: After Learning>>
  :
epoch 19
model says:
  0 & 0 = 1 (zero:-0.005195 one:0.005195)
  0 & 1 = 1 (zero:-0.365441 one:-0.365370)
  1 & 0 = 0 (zero:0.418680 one:0.408669)
  1 & 1 = 0 (zero:0.058434 one:0.038104)
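This failure is expected because XOR is not linearly separable: no single affine map w1*x1 + w2*x2 + b can put (0,1) and (1,0) on one side of a line and (0,0) and (1,1) on the other, which is all a single Linear link can express. A brute-force sketch (not from the article; the `separable` helper and the weight grid are my own) illustrates this:

```python
import numpy as np

def separable(targets, grid=np.linspace(-3, 3, 25)):
    """Return True if some line w1*x1 + w2*x2 + b > 0 reproduces `targets`
    on the four binary inputs (brute-force over a coarse weight grid)."""
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    for w1 in grid:
        for w2 in grid:
            for b in grid:
                pred = (X @ np.array([w1, w2]) + b > 0).astype(int)
                if np.array_equal(pred, targets):
                    return True
    return False

print(separable(np.array([0, 0, 0, 1])))  # AND → True
print(separable(np.array([0, 1, 1, 1])))  # OR  → True
print(separable(np.array([0, 1, 1, 0])))  # XOR → False
```

No point on the grid separates XOR, and no finer grid would help: the four constraints contradict each other for any line.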

So, with two Links, everything should work.

With Two Links

I'll omit the And/Or results here, but Xor was also learned successfully.

However, with two Links, convergence takes considerably longer.
And/Or needed four times as many epochs, and Xor appeared to finish learning reliably at around 200 epochs.

<<XOR: After Learning>>
  :
epoch 199
model says:
  0 & 0 = 0 (zero:2.194428 one:-2.195220)
  0 & 1 = 1 (zero:-2.352996 one:2.251660)
  1 & 0 = 1 (zero:-2.347063 one:2.247520)
  1 & 1 = 0 (zero:1.743878 one:-2.734182)
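The two-Link model can represent XOR because the sigmoid between the layers makes the mapping nonlinear. As a sketch (hand-picked weights for illustration, not the ones training actually found), one hidden unit can approximate OR and the other NAND, and the output layer combines them with AND:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Layer 1: hidden unit 0 ~ OR(x1, x2), hidden unit 1 ~ NAND(x1, x2)
W1 = np.array([[20.0, 20.0], [-20.0, -20.0]])
b1 = np.array([-10.0, 30.0])
# Layer 2: output ~ AND(h0, h1), which equals XOR(x1, x2)
W2 = np.array([20.0, 20.0])
b2 = -30.0

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    h = sigmoid(W1 @ np.array(x, dtype=float) + b1)
    y = sigmoid(W2 @ h + b2)
    print('%d xor %d = %d' % (x[0], x[1], int(y > 0.5)))
# → 0 xor 0 = 0, 0 xor 1 = 1, 1 xor 0 = 1, 1 xor 1 = 0
```

Training with SGD has to discover weights like these from scratch, which is one intuition for why the two-Link model converges so much more slowly.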

Summary

Having gone through all of this, it finally clicks (wry smile).
The two-Link example uses a sigmoid function, which brings out a bit more of a neural-network feel.
Chainer makes it very easy to implement a neural network, but precisely because of that, it assumes you already know the basics of neural networks. A software engineer unfamiliar with them who tries the tutorial will likely stumble on API names and concepts; consulting a reference such as 深層学習 (機械学習プロフェッショナルシリーズ) while working through it seems like an efficient way to learn.

Code

Single Link

GIST

# And/Or/Xor classifier network example
#
# This is re-written version of:
#   http://hi-king.hatenablog.com/entry/2015/06/27/194630
# By following chainer introduction:
#   http://docs.chainer.org/en/stable/tutorial/basic.html

## Chainer cliche
import numpy as np
import chainer
from chainer import cuda, Function, gradient_check, Variable, optimizers, serializers, utils
from chainer import Link, Chain, ChainList
import chainer.functions as F
import chainer.links as L

# Neural Network

## Network definition
class NN2x2x1dim(Chain):
    def __init__(self):
        super(NN2x2x1dim, self).__init__(
            l = F.Linear(2, 2),
        )
    def __call__(self, x):
        h = self.l(x)
        return h

# Sub routine

## Utility: Summarize current results
def summarize(model, optimizer, inputs, outputs):
    sum_loss, sum_accuracy = 0, 0
    print('model says:')
    for i in range(len(inputs)):
        x  = Variable(inputs[i].reshape(1,2).astype(np.float32), volatile=False)
        t  = Variable(outputs[i].astype(np.int32))
        y = model.predictor(x)
        loss = model(x, t)
        sum_loss += loss.data
        sum_accuracy += model.accuracy.data
        print('  %d & %d = %d (zero:%f one:%f)' % (x.data[0,0], x.data[0,1], np.argmax(y.data), y.data[0,0], y.data[0,1]))
    #mean_loss = sum_loss / len(inputs)
    #mean_accuracy = sum_accuracy / len(inputs)
    #print sum_loss, sum_accuracy, mean_loss, mean_accuracy

## Runs learning loop
def learning_looper(model, optimizer, inputs, outputs, epoch_size):
    augment_size = 100
    for epoch in range(epoch_size):
        print('epoch %d' % epoch)
        for a in range(augment_size):
            for i in range(len(inputs)):
                x = Variable(inputs[i].reshape(1,2).astype(np.float32), volatile=False)
                t = Variable(outputs[i].astype(np.int32), volatile=False)
                optimizer.update(model, x, t)
        summarize(model, optimizer, inputs, outputs)

# Main

## Test data
inputs = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]], dtype=np.float32)
and_outputs = np.array([[0], [0], [0], [1]], dtype=np.int32)
or_outputs = np.array([[0], [1], [1], [1]], dtype=np.int32)
xor_outputs = np.array([[0], [1], [1], [0]], dtype=np.int32)

## AND Test --> will learn successfully
## Model & Optimizer instance
and_model = L.Classifier(NN2x2x1dim())
optimizer = optimizers.SGD()
# quicker) optimizer = optimizers.MomentumSGD(lr=0.01, momentum=0.9)
optimizer.setup(and_model)
print('<<AND: Before learning>>')
summarize(and_model, optimizer, inputs, and_outputs)
print('\n<<AND: After Learning>>')
learning_looper(and_model, optimizer, inputs, and_outputs, epoch_size = 5)

## OR Test --> will learn successfully
## Model & Optimizer instance
or_model = L.Classifier(NN2x2x1dim())
optimizer = optimizers.SGD()
optimizer.setup(or_model)
print('---------\n\n<<OR: Before learning>>')
summarize(or_model, optimizer, inputs, or_outputs)
print('\n<<OR: After Learning>>')
learning_looper(or_model, optimizer, inputs, or_outputs, epoch_size = 5)

## XOR Test --> will FAIL, single link is not enough for XOR
## Model & Optimizer instance
xor_model = L.Classifier(NN2x2x1dim())
optimizer = optimizers.SGD()
optimizer.setup(xor_model)
print('---------\n\n<<XOR: Before learning>>')
summarize(xor_model, optimizer, inputs, xor_outputs)
print('\n<<XOR: After Learning>>')
learning_looper(xor_model, optimizer, inputs, xor_outputs, epoch_size = 20)

Two Links

GIST

# Chainer training: And/Or/Xor classifier network example with 2 links.
#
# This is re-written version of:
#   http://hi-king.hatenablog.com/entry/2015/06/27/194630
# By following chainer introduction:
#   http://docs.chainer.org/en/stable/tutorial/basic.html

## Chainer cliche
import numpy as np
import chainer
from chainer import cuda, Function, gradient_check, Variable, optimizers, serializers, utils
from chainer import Link, Chain, ChainList
import chainer.functions as F
import chainer.links as L

# Neural Network

## Network definition
class NN2x2_2links(Chain):
    def __init__(self):
        super(NN2x2_2links, self).__init__(
            l1 = F.Linear(2, 2),
            l2 = F.Linear(2, 2),
        )
    def __call__(self, x):
        h = self.l2(F.sigmoid(self.l1(x)))
        return h

# Sub routine

## Utility: Summarize current results
def summarize(model, optimizer, inputs, outputs):
    sum_loss, sum_accuracy = 0, 0
    print('model says:')
    for i in range(len(inputs)):
        x  = Variable(inputs[i].reshape(1,2).astype(np.float32), volatile=False)
        t  = Variable(outputs[i].astype(np.int32))
        y = model.predictor(x)
        loss = model(x, t)
        sum_loss += loss.data
        sum_accuracy += model.accuracy.data
        print('  %d & %d = %d (zero:%f one:%f)' % (x.data[0,0], x.data[0,1], np.argmax(y.data), y.data[0,0], y.data[0,1]))
    #mean_loss = sum_loss / len(inputs)
    #mean_accuracy = sum_accuracy / len(inputs)
    #print sum_loss, sum_accuracy, mean_loss, mean_accuracy

## Runs learning loop
def learning_looper(model, optimizer, inputs, outputs, epoch_size):
    augment_size = 100
    for epoch in range(epoch_size):
        print('epoch %d' % epoch)
        for a in range(augment_size):
            for i in range(len(inputs)):
                x = Variable(inputs[i].reshape(1,2).astype(np.float32), volatile=False)
                t = Variable(outputs[i].astype(np.int32), volatile=False)
                optimizer.update(model, x, t)
        summarize(model, optimizer, inputs, outputs)

# Main

## Test data
inputs = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]], dtype=np.float32)
and_outputs = np.array([[0], [0], [0], [1]], dtype=np.int32)
or_outputs = np.array([[0], [1], [1], [1]], dtype=np.int32)
xor_outputs = np.array([[0], [1], [1], [0]], dtype=np.int32)

## AND Test --> will learn successfully
and_model = L.Classifier(NN2x2_2links())
optimizer = optimizers.SGD()
# do it quicker) optimizer = optimizers.MomentumSGD(lr=0.01, momentum=0.9)
optimizer.setup(and_model)
print('<<AND: Before learning>>')
summarize(and_model, optimizer, inputs, and_outputs)
print('\n<<AND: After Learning>>')
learning_looper(and_model, optimizer, inputs, and_outputs, epoch_size = 20)

## OR Test --> will learn successfully
or_model = L.Classifier(NN2x2_2links())
optimizer = optimizers.SGD()
optimizer.setup(or_model)
print('---------\n\n<<OR: Before learning>>')
summarize(or_model, optimizer, inputs, or_outputs)
print('\n<<OR: After Learning>>')
learning_looper(or_model, optimizer, inputs, or_outputs, epoch_size = 20)

## XOR Test --> will learn successfully
xor_model = L.Classifier(NN2x2_2links())
optimizer = optimizers.SGD()
optimizer.setup(xor_model)
print('---------\n\n<<XOR: Before learning>>')
summarize(xor_model, optimizer, inputs, xor_outputs)
print('\n<<XOR: After Learning>>')
learning_looper(xor_model, optimizer, inputs, xor_outputs, epoch_size = 200)