
Making Chainer usable in a scikit-learn-like way

Posted at 2015-07-11

Motivation

Chainer has been getting a lot of attention lately for how easily it lets you write deep learning models, but its API still feels somewhat unpolished to me, so I made it usable in a scikit-learn-like way.

Deliverables -> https://github.com/lucidfrontier45/scikit-chainer , https://pypi.python.org/pypi/scikit-chainer
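
Since the package is published on PyPI (second link above), it should also be installable with pip install scikit-chainer.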

Approach

Make it usable the same way as scikit-learn, i.e. model.fit(X, y), model.predict(X), model.score(X, y).
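
For reference, this is the calling convention that ordinary scikit-learn estimators expose. Here is a minimal sketch using sklearn's own LogisticRegression on random toy data (everything in it is just for illustration and not part of scikit-chainer):

# Sketch: the scikit-learn interface that scikit-chainer mimics, shown with
# sklearn's own LogisticRegression on random toy data.
import numpy as np
from sklearn.linear_model import LogisticRegression as SkLogisticRegression

X = np.random.randn(100, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

sk_model = SkLogisticRegression().fit(X, y)  # model.fit(X, y)
labels = sk_model.predict(X)                 # model.predict(X)
accuracy = sk_model.score(X, y)              # model.score(X, y), mean accuracy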

Implementation

It isn't very long, so I'll paste the whole thing.
First, the base class. It corresponds to sklearn.base.BaseEstimator.

from abc import ABCMeta, abstractmethod
from chainer import FunctionSet, Variable, optimizers
from chainer import functions as F
from sklearn import base


class BaseChainerEstimator(base.BaseEstimator, metaclass=ABCMeta):
    def __init__(self, optimizer=optimizers.SGD(), n_iter=10000, eps=1e-5, report=100,
                 **params):
        self.network = self._setup_network(**params)
        self.optimizer = optimizer
        self.optimizer.setup(self.network.collect_parameters())
        self.n_iter = n_iter
        self.eps = eps
        self.report = report

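    # The methods below are abstract and meant to be overridden by concrete
    # models; the bodies here only serve as reference implementations.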
    @abstractmethod
    def _setup_network(self, **params):
        return FunctionSet(l1=F.Linear(1, 1))

    @abstractmethod
    def forward(self, x):
        y = self.network.l1(x)
        return y

    @abstractmethod
    def loss_func(self, y, t):
        return F.mean_squared_error(y, t)

    @abstractmethod
    def output_func(self, h):
        return F.identity(h)

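    # Full-batch gradient descent: stop once the decrease of the loss between
    # iterations falls below eps, or after n_iter iterations at most.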
    def fit(self, x_data, y_data):
        score = 1e100
        x = Variable(x_data)
        t = Variable(y_data)
        for i in range(self.n_iter):
            self.optimizer.zero_grads()
            loss = self.loss_func(self.forward(x), t)
            loss.backward()
            self.optimizer.update()
            d_score = score - loss.data
            score = loss.data
            if d_score < self.eps:
                print(i, loss.data, d_score)
                break
            if self.report > 0 and i % self.report == 0:
                print(i, loss.data, d_score)
        return self

    def predict(self, x_data):
        x = Variable(x_data)
        y = self.forward(x)
        return self.output_func(y).data

Then, inheriting from this, we make base classes for regression and classification.
Basically all we do is mix in sklearn.base.RegressorMixin and sklearn.base.ClassifierMixin.

class ChainerRegresser(BaseChainerEstimator, base.RegressorMixin):
    pass


class ChainerClassifier(BaseChainerEstimator, base.ClassifierMixin):
    def predict(self, x_data):
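        # pick the class index with the largest output, i.e. the predicted label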
        return BaseChainerEstimator.predict(self, x_data).argmax(1)
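
Mixing in ClassifierMixin is what provides score(X, y): it computes the mean accuracy of predict() against y (likewise, RegressorMixin provides score() as the R^2 coefficient of determination). Roughly, it boils down to the sketch below (classifier_score is just an illustrative name, not part of sklearn or scikit-chainer):

# Sketch: what ClassifierMixin.score(X, y) amounts to for these classifiers.
from sklearn.metrics import accuracy_score

def classifier_score(model, x_data, y_data):
    # model.predict already returns argmax class labels (see ChainerClassifier)
    return accuracy_score(y_data, model.predict(x_data))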

Usage example

As a test, let's implement LogisticRegression.

class LogisticRegression(ChainerClassifier):
    def _setup_network(self, **params):
        return FunctionSet(l1=F.Linear(params["n_dim"], params["n_class"]))

    def forward(self, x):
        y = self.network.l1(x)
        return y

    def loss_func(self, y, t):
        return F.softmax_cross_entropy(y, t)

    def output_func(self, h):
        return F.softmax(h)

Let's test it.

import numpy as np
from skchainer import linear
from scipy import special

x = np.linspace(-10, 10, 10000).astype(np.float32)
p = special.expit(x)
y = np.random.binomial(1, p).astype(np.int32)
x = x.reshape(len(x), 1)
model = linear.LogisticRegression(n_dim=1, n_class=2, report=100).fit(x, y)
print(model.score(x, y))

The output below confirms that it converges properly and classifies the data correctly.

0 3.128652334213257 1e+100
100 0.179422065615654 0.000214472
200 0.16965821385383606 4.20064e-05
300 0.1672072559595108 1.36197e-05
332 0.16683495044708252 9.96888e-06
0.9284
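
Since the model follows the standard fit/predict/score convention, it also plugs into the usual scikit-learn helpers. Below is a minimal sketch of a held-out evaluation with train_test_split (found in sklearn.model_selection in current versions, sklearn.cross_validation in older ones), assuming the same skchainer.linear.LogisticRegression as above:

# Sketch: held-out evaluation of the same model using train_test_split.
import numpy as np
from scipy import special
from sklearn.model_selection import train_test_split
from skchainer import linear

x = np.linspace(-10, 10, 10000).astype(np.float32)
y = np.random.binomial(1, special.expit(x)).astype(np.int32)
x = x.reshape(len(x), 1)

x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3, random_state=0)
model = linear.LogisticRegression(n_dim=1, n_class=2, report=0).fit(x_train, y_train)
print(model.score(x_test, y_test))  # accuracy on unseen data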

(P.S.)

I also made a Multi-Layer Perceptron.

# Note: this snippet reuses the ChainerClassifier base class defined earlier.
from chainer import FunctionSet, optimizers
from chainer import functions as F


class MultiLayerPerceptron(ChainerClassifier):
    def _setup_network(self, **params):
        print(params)
        network = FunctionSet(
            l1=F.Linear(params["input_dim"], params["hidden_dim"]),
            l2=F.Linear(params["hidden_dim"], params["n_classes"])
        )

        return network

    def forward(self, x):
        h = F.relu(self.network.l1(x))
        y = self.network.l2(h)
        return y

    def loss_func(self, y, t):
        return F.softmax_cross_entropy(y, t)

    def output_func(self, h):
        return F.softmax(h)


if __name__ == "__main__":
    import numpy as np
    from sklearn import datasets

    iris = datasets.load_iris()
    x = iris.data.astype(np.float32)
    y = iris.target.astype(np.int32)

    input_dim = x.shape[1]
    n_classes = len(set(y))

    model = MultiLayerPerceptron(optimizer=optimizers.AdaDelta(rho=0.5),
        input_dim=input_dim, hidden_dim=10, n_classes=n_classes, report=100).fit(x, y)
    print(model.score(x, y))
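
One nice side effect of taking the optimizer as a constructor argument is that it can be swapped without touching the model class at all. A minimal sketch repeating the iris example, with AdaGrad chosen purely as an illustration:

# Sketch: swapping in a different Chainer optimizer without changing the model.
import numpy as np
from chainer import optimizers
from sklearn import datasets

iris = datasets.load_iris()
x = iris.data.astype(np.float32)
y = iris.target.astype(np.int32)

model = MultiLayerPerceptron(optimizer=optimizers.AdaGrad(lr=0.01),
                             input_dim=x.shape[1], hidden_dim=10,
                             n_classes=len(set(y)), report=0).fit(x, y)
print(model.score(x, y))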