
Jacobian and Hessian Matrices in Python

Posted at 2024-04-21

Introduction

This article introduces Python libraries that can compute Jacobian and Hessian matrices.

We will use the following function throughout:
$$ f(x_0, x_1) = \frac{\log x_0}{x_0^2} + \frac{x_1^2}{37} - \frac{x_1}{7}, \qquad x_0 = 2,\ x_1 = 2 $$
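For reference, differentiating by hand gives

$$ \nabla f = \begin{pmatrix} \dfrac{1 - 2\log x_0}{x_0^3} & \dfrac{2x_1}{37} - \dfrac{1}{7} \end{pmatrix}, \qquad \nabla^2 f = \begin{pmatrix} \dfrac{6\log x_0 - 5}{x_0^4} & 0 \\ 0 & \dfrac{2}{37} \end{pmatrix} $$

so at $(x_0, x_1) = (2, 2)$ every library below should report a Jacobian of roughly $(-0.0483,\ -0.0347)$ and a Hessian of roughly $\operatorname{diag}(-0.0526,\ 0.0541)$.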

SciPy

SciPy has a function that numerically approximates the Jacobian (the gradient, for a scalar-valued function), but no public function for computing the Hessian; a finite-difference workaround is sketched after the snippet below.

scipy.optimize
import numpy as np
from scipy.optimize import approx_fprime

def f(x):
    return np.log(x[0])/(x[0]**2) + (x[1]**2)/37 - x[1]/7

x = np.array([2.0, 2.0])
jac = approx_fprime(x, f)  # finite-difference gradient; the step size has a default in recent SciPy
print(jac)
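Since there is no ready-made Hessian routine, one workaround is to finite-difference the gradient itself. Below is a minimal sketch of that idea; approx_hessian is a hypothetical helper written for illustration, not a SciPy function.

import numpy as np
from scipy.optimize import approx_fprime

def f(x):
    return np.log(x[0])/(x[0]**2) + (x[1]**2)/37 - x[1]/7

# Hypothetical helper: the Hessian as the finite-difference Jacobian of the gradient.
def approx_hessian(x, eps=1e-5):
    n = x.size
    H = np.zeros((n, n))
    g0 = approx_fprime(x, f)
    for i in range(n):
        x_step = x.copy()
        x_step[i] += eps
        H[:, i] = (approx_fprime(x_step, f) - g0) / eps
    return (H + H.T) / 2  # symmetrize to reduce finite-difference noise

x = np.array([2.0, 2.0])
print(approx_hessian(x))

Expect values close to the analytic diag(-0.0526, 0.0541) above, with some loss of precision from the nested differencing.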

SymPy

SymPy can output the Jacobian and Hessian as symbolic expressions, which is convenient for checking results by hand. (We won't do that in earnest here, but a short sketch follows the snippet.)

sympy
import sympy as sp

x0, x1 = sp.symbols('x0 x1')
f = sp.log(x0)/(x0**2) + (x1**2)/37 - x1/7

# Symbolic Jacobian, compiled into a numeric function with lambdify.
jac_expr = sp.Matrix([f]).jacobian([x0, x1])
jac_func = sp.lambdify((x0, x1), jac_expr)
jac = jac_func(2.0, 2.0)
print(jac)

# Same pattern for the Hessian.
hess_expr = sp.hessian(f, [x0, x1])
hess_func = sp.lambdify((x0, x1), hess_expr)
hess = hess_func(2.0, 2.0)
print(hess)
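If you do want to eyeball the symbolic derivatives, pretty-printing them is enough; a minimal sketch:

import sympy as sp

x0, x1 = sp.symbols('x0 x1')
f = sp.log(x0)/(x0**2) + (x1**2)/37 - x1/7

# Pretty-print the symbolic Jacobian and Hessian for a hand check.
sp.pprint(sp.Matrix([f]).jacobian([x0, x1]))
sp.pprint(sp.hessian(f, [x0, x1]))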

PyTorch

You can just as well use jacrev instead of jacfwd (a variant is sketched after the snippet).

torch.func
import torch
from torch.func import hessian, jacfwd

def f(x):
    return torch.log(x[0])/(x[0]**2) + (x[1]**2)/37 - x[1]/7

x = torch.tensor([2.0, 2.0])

# Forward-mode Jacobian of f at x.
jac_func = jacfwd(f)
jac = jac_func(x)
print(jac)

hess_func = hessian(f)
hess = hess_func(x)
print(hess)
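For completeness, here is the jacrev variant. The PyTorch docs note that torch.func.hessian(f) is computed with a forward-over-reverse strategy, i.e. jacfwd(jacrev(f)), so the composed form below is equivalent:

import torch
from torch.func import jacfwd, jacrev

def f(x):
    return torch.log(x[0])/(x[0]**2) + (x[1]**2)/37 - x[1]/7

x = torch.tensor([2.0, 2.0])

# Reverse-mode Jacobian; same values as the jacfwd version above.
print(jacrev(f)(x))

# Equivalent to hessian(f): forward-over-reverse composition.
print(jacfwd(jacrev(f))(x))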

torch.autograd.functional
import torch
from torch.autograd.functional import hessian, jacobian

def f(x):
    return torch.log(x[0])/(x[0]**2) + (x[1]**2)/37 - x[1]/7

x = torch.tensor([2.0, 2.0])

# Older eager-mode API: evaluates the full Jacobian/Hessian at x directly.
jac = jacobian(f, x)
print(jac)

hess = hessian(f, x)
print(hess)

Autograd

Development of this library no longer seems very active.

autograd
import autograd.numpy as np
from autograd import jacobian, hessian

# Note: the function must be written against autograd's NumPy wrapper.
def f(x):
    return np.log(x[0])/(x[0]**2) + (x[1]**2)/37 - x[1]/7

x = np.array([2.0, 2.0])

jac_func = jacobian(f)
jac = jac_func(x)
print(jac)

hess_func = hessian(f)
hess = hess_func(x)
print(hess)