How to use scipy.optimize.minimize


The Japanese translation of the SciPy reference for scipy.optimize describes a variety of optimization functions, so I tried out several of them.
I took the quadratic y = c + a*(x - b)**2, added Gaussian noise, and then fit a quadratic to the result to recover the parameters.

import numpy as np
from matplotlib import pyplot as plt
import scipy.optimize as optimize
%matplotlib inline 

Scikit-learn is also imported, since later on we compute the coefficient of determination R2, the mean squared error, and the mean absolute error.

from sklearn.metrics import r2_score
from sklearn.metrics import mean_squared_error
from sklearn.metrics import mean_absolute_error

The quadratic function that serves as the model:

def base_func(x, a, b, c):
    y = c + a*(x - b)**2
    return y

Plug in the parameters, evaluate, and plot the result.

x = np.arange(-30, 30, 1)
para = [2.0, 5.0, 10.0]  # true parameters: a, b, c
np.random.seed(seed=10)
y = base_func(x, para[0], para[1], para[2]) + np.random.normal(0, 60, len(x))

plt.scatter(x, y)
plt.show()

(Figure output_4_0.png: scatter plot of the noisy data)

Prepare a function for plotting the optimized fit.

def plot_fig(x, y, para):
    fit = base_func(x, para[0], para[1], para[2])
    r2 = r2_score(y, fit)
    print()
    print('R2:', r2)
    print('Results:', para)
    plt.scatter(x, y, label='data')
    plt.plot(x, fit, 'r', label='fit')
    plt.legend()
    plt.show()

Least-squares method (minimizing the RMSE)

Reference: Python SciPy: optimization algorithms for nonlinear least-squares problems

# Curve fitting: curve_fit returns (optimal parameters, covariance matrix)
cf = optimize.curve_fit(f=base_func, xdata=x, ydata=y, p0=para)
print(cf)
print(cf[0])
plot_fig(x, y, cf[0])

(array([ 2.01296897,  4.92997267, 16.69809461]), array([[ 6.97990322e-04, -1.88282482e-03, -1.88758970e-01],
[-1.88282482e-03,  1.54028062e-02,  2.83488209e-01],
[-1.88758970e-01,  2.83488209e-01,  1.06165700e+02]]))
[ 2.01296897  4.92997267 16.69809461]

R2: 0.9934619969757706
Results: [ 2.01296897 4.92997267 16.69809461]

(Figure output_7_1.png: the data with the curve_fit result)
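
The second element of the curve_fit return value is the covariance matrix of the estimated parameters; the square roots of its diagonal entries give one-standard-deviation uncertainties for a, b, and c. A minimal sketch:

popt, pcov = optimize.curve_fit(f=base_func, xdata=x, ydata=y, p0=para)
perr = np.sqrt(np.diag(pcov))  # 1-sigma uncertainty of each parameter
print(perr)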

Nonlinear least squares

A residual function has to be defined. Note that, unlike the model function passed to curve_fit (which takes x as its first argument), the residual function takes the parameter vector first.

# residual function
def ls_res(param, x, y):
    residual = y - base_func(x, param[0], param[1], param[2])
    return residual

# Legacy function: leastsq(func, x0[, args, Dfun, full_output, …])
# minimizes the sum of squares of a set of equations.
le_lsq = optimize.leastsq(ls_res, para, args=(x, y))
print(le_lsq)
print(le_lsq[0])
plot_fig(x, y, le_lsq[0])

(array([ 2.01296897,  4.92997267, 16.69809461]), 1)
[ 2.01296897  4.92997267 16.69809461]

R2: 0.9934619969757706
Results: [ 2.01296897 4.92997267 16.69809461]

(Figure output_10_1.png: the data with the leastsq fit)
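
The 1 in the tuple returned by leastsq is the integer status flag ier; per the leastsq documentation, values 1 through 4 mean a solution was found. One way to unpack and check it:

# ier is leastsq's convergence flag: 1-4 means success
sol, ier = optimize.leastsq(ls_res, para, args=(x, y))
if ier in (1, 2, 3, 4):
    print('converged:', sol)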

# Recommended: least_squares(fun, x0[, jac, bounds, …])
# solves a nonlinear least-squares problem with bounds on the variables.
lsq = optimize.least_squares(ls_res, para, args=(x, y))
print(lsq.x)
plot_fig(x, y, lsq.x)

[ 2.01296897  4.92997267 16.69809461]

R2: 0.9934619969757706
Results: [ 2.01296897 4.92997267 16.69809461]

(Figure output_11_1.png: the data with the least_squares fit)
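
Unlike leastsq, least_squares also accepts box constraints through its bounds keyword, given as a pair of lower-bound and upper-bound arrays. A minimal sketch (the bound values here are arbitrary examples):

# Constrain a, b, c to example ranges: (lower bounds, upper bounds)
lsq_b = optimize.least_squares(ls_res, para, args=(x, y),
                               bounds=([0, -10, -50], [10, 10, 50]))
print(lsq_b.x)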

Optimization

Define the function to minimize.
Root Mean Squared Error (RMSE)
$$RMSE = \sqrt{ \frac{ \sum_{i} (y_{\mathrm{obs}, i} - y_{\mathrm{pred}, i})^2 }{ n }}$$

# Define the RMSE function
def rmse_res(param, x, y):
    residual = y - base_func(x, param[0], param[1], param[2])
    resid_ = np.sqrt(((residual*residual).sum())/len(x))
    return resid_

# For reference, the scikit-learn version; it gives the same result as rmse_res
# def rmse_res2(param, x, y):
#     fit = base_func(x, param[0], param[1], param[2])
#     rmse = np.sqrt(mean_squared_error(y, fit))
#     return rmse

# Local (multivariate) optimization: minimize(fun, x0[, args, method, jac, hess, …])
# minimizes a scalar function of one or more variables.
mi_rmse = optimize.minimize(rmse_res, para, args=(x, y), method='Nelder-Mead')
print(mi_rmse)

# Note how the result is accessed: the solution is in the .x attribute
print(mi_rmse.x)
plot_fig(x, y, mi_rmse.x)
     final_simplex: (array([[ 2.01296895,  4.92997307, 16.69812532],
           [ 2.01296886,  4.92997279, 16.69816863],
           [ 2.01296906,  4.92997222, 16.6981314 ],
           [ 2.01296907,  4.92997182, 16.69803887]]), array([53.48433856, 53.48433856, 53.48433856, 53.48433856]))
               fun: 53.48433855800152
           message: 'Optimization terminated successfully.'
              nfev: 173
               nit: 93
            status: 0
           success: True
                 x: array([ 2.01296895,  4.92997307, 16.69812532])
    [ 2.01296895  4.92997307 16.69812532]

R2: 0.9934619969757669
Results: [ 2.01296895 4.92997307 16.69812532]

(Figure output_16_1.png: the data with the minimize (Nelder-Mead, RMSE) fit)
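
Nelder-Mead is a derivative-free simplex method, which is why no gradient has to be supplied. For a smooth objective like this RMSE, a quasi-Newton method such as BFGS (SciPy estimates the gradient numerically when none is given) should reach the same minimum; a minimal sketch:

mi_bfgs = optimize.minimize(rmse_res, para, args=(x, y), method='BFGS')
print(mi_bfgs.x)  # should agree with the Nelder-Mead result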

# fmin(func, x0[, args, xtol, ftol, maxiter, …]): legacy function, not recommended
fmin_rmse = optimize.fmin(rmse_res, para, args=(x, y))
print(fmin_rmse)
plot_fig(x, y, fmin_rmse)
    Optimization terminated successfully.
             Current function value: 53.484339
             Iterations: 93
             Function evaluations: 173
    [ 2.01296895  4.92997307 16.69812532]

R2: 0.9934619969757669
Results: [ 2.01296895 4.92997307 16.69812532]

(Figure output_17_1.png: the data with the fmin fit)

Global optimization

# basinhopping(func, x0[, niter, T, stepsize, …]) finds the global minimum
# of a function using the basin-hopping algorithm.
bh_rmse = optimize.basinhopping(rmse_res, para, minimizer_kwargs={"method": "L-BFGS-B", "args": (x, y)})
print(bh_rmse)
print(bh_rmse.x)
plot_fig(x, y, bh_rmse.x)
     fun: 53.48433855798703
     lowest_optimization_result:       fun: 53.48433855798703
     hess_inv: <3x3 LbfgsInvHessProduct with dtype=float64>
          jac: array([9.66338121e-05, 5.68434189e-06, 0.00000000e+00])
      message: b'CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH'
         nfev: 120
          nit: 24
       status: 0
      success: True
            x: array([ 2.012969  ,  4.92997261, 16.6980913 ])
                        message: ['requested number of basinhopping iterations completed successfully']
          minimization_failures: 0
                           nfev: 6564
                            nit: 100
                              x: array([ 2.012969  ,  4.92997261, 16.6980913 ])
    [ 2.012969    4.92997261 16.6980913 ]

R2: 0.9934619969757704
Results: [ 2.012969 4.92997261 16.6980913 ]

(Figure output_19_1.png: the data with the basinhopping fit)
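
basinhopping alternates random jumps in parameter space with local minimizations. The niter, stepsize, and seed arguments control the number of hops, the jump size, and reproducibility; a sketch with arbitrary example values:

bh2 = optimize.basinhopping(rmse_res, para, niter=50, stepsize=1.0, seed=0,
                            minimizer_kwargs={"method": "L-BFGS-B", "args": (x, y)})
print(bh2.x)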

# differential_evolution(func, bounds[, args, …]) finds the global minimum
# of a multivariate function.
bounds = [(0, 10), (-10, 10), (-50, 50)]
de_rmse = optimize.differential_evolution(rmse_res, bounds, args=(x, y))
print(de_rmse)
print(de_rmse.x)
plot_fig(x, y, de_rmse.x)

         fun: 53.48433860476771
         jac: array([-1.77564630e-03, -5.75539616e-04, -2.98427949e-05])
     message: 'Optimization terminated successfully.'
        nfev: 935
         nit: 18
     success: True
           x: array([ 2.01297481,  4.92995775, 16.6948975 ])
    [ 2.01297481  4.92995775 16.6948975 ]

R2: 0.9934619969643333
Results: [ 2.01297481 4.92995775 16.6948975 ]

(Figure output_20_1.png: the data with the differential_evolution fit)
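
differential_evolution is stochastic, so repeated runs give slightly different results. Passing seed makes a run reproducible, and tol and popsize trade convergence tightness against cost; a sketch with arbitrary example values:

de2 = optimize.differential_evolution(rmse_res, bounds, args=(x, y),
                                      seed=1, tol=1e-8, popsize=30)
print(de2.x)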

# brute(func, ranges[, args, Ns, full_output, …]) minimizes a function
# over a given range by brute force (grid search).
br_rmse = optimize.brute(rmse_res, ranges=bounds, args=(x, y))
print(br_rmse)
plot_fig(x, y, br_rmse)
    [ 2.01296928  4.92997168 16.69807946]

R2: 0.9934619969757458
Results: [ 2.01296928 4.92997168 16.69807946]

(Figure output_21_1.png: the data with the brute fit)
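
By default brute evaluates the objective on a grid of Ns points along each parameter range and then polishes the best grid point with fmin. With full_output=True it also returns the best function value and the evaluation grids; a minimal sketch:

x0_br, fval, grid, Jout = optimize.brute(rmse_res, ranges=bounds, args=(x, y),
                                         Ns=20, full_output=True)
print(x0_br, fval)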

Switching the minimized function to the absolute error

Mean Absolute Error (MAE)
$$ MAE = \frac{ \sum_{i} | y_{\mathrm{obs}, i} - y_{\mathrm{pred}, i} | }{ n }$$

# Define the MAE function
def mae_res(param, x, y):
    residual = np.abs(y - base_func(x, param[0], param[1], param[2]))
    resid_ = ((residual).sum())/len(x)
    return resid_

# For reference, the scikit-learn version; it gives the same result as mae_res
# def mae_res2(param, x, y):
#     fit = base_func(x, param[0], param[1], param[2])
#     mae = mean_absolute_error(y, fit)
#     return mae

mi_mae = optimize.minimize(mae_res, para, args=(x, y), method='Nelder-Mead')
print(mi_mae)
print(mi_mae.x)
plot_fig(x, y, mi_mae.x)

     final_simplex: (array([[ 2.00441621,  5.08263527, 12.59609788],
           [ 2.00441602,  5.0826368 , 12.59609761],
           [ 2.00441614,  5.08263639, 12.59604168],
           [ 2.00441602,  5.0826364 , 12.5961183 ]]), array([41.23285905, 41.23285928, 41.23285957, 41.2328597 ]))
               fun: 41.23285904630758
           message: 'Optimization terminated successfully.'
              nfev: 197
               nit: 108
            status: 0
           success: True
                 x: array([ 2.00441621,  5.08263527, 12.59609788])
    [ 2.00441621  5.08263527 12.59609788]

R2: 0.9932364583092926
Results: [ 2.00441621 5.08263527 12.59609788]

(Figure output_25_1.png: the data with the minimize (Nelder-Mead, MAE) fit)

fmin_mae = optimize.fmin(mae_res, para, args=(x, y))
print(fmin_mae)
plot_fig(x, y, fmin_mae)
    Optimization terminated successfully.
             Current function value: 41.232859
             Iterations: 108
             Function evaluations: 197
    [ 2.00441621  5.08263527 12.59609788]

R2: 0.9932364583092926
Results: [ 2.00441621 5.08263527 12.59609788]

(Figure output_26_1.png: the data with the fmin (MAE) fit)

# differential_evolution(func, bounds[, args, …]) finds the global minimum
# of a multivariate function.
de_mae = optimize.differential_evolution(mae_res, bounds, args=(x, y))
print(de_mae)
print(de_mae.x)
plot_fig(x, y, de_mae.x)

         fun: 41.22302370179955
         jac: array([ 8.8972655 , -2.14848157, -0.03333298])
     message: 'Optimization terminated successfully.'
        nfev: 730
         nit: 13
     success: True
           x: array([ 2.0129207 ,  5.01018698, 13.87485399])
    [ 2.0129207   5.01018698 13.87485399]

R2: 0.993388114525508
Results: [ 2.0129207 5.01018698 13.87485399]

(Figure output_27_1.png: the data with the differential_evolution (MAE) fit)
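
To see how the two objectives differ, a small sketch that scores both Nelder-Mead fits on both metrics; each fit should score at least as well on its own criterion, and the MAE fit is the less outlier-sensitive of the two:

# Compare the RMSE-optimal and MAE-optimal parameters on both metrics
for name, p in [('RMSE fit', mi_rmse.x), ('MAE fit', mi_mae.x)]:
    fit = base_func(x, p[0], p[1], p[2])
    print(name,
          'RMSE:', np.sqrt(mean_squared_error(y, fit)),
          'MAE:', mean_absolute_error(y, fit))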
