Deliberately Learning "Deep Learning from Scratch" in Ruby: The Basic Functions

Posted at 2020-02-17

Let's port the following functions to Ruby. With these in hand, we will implement neural networks from the next article onward.
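
Throughout, numo-narray stands in for NumPy. Here is a minimal, self-contained sketch of the pieces the port relies on: element-wise arithmetic, comparison masks, and reductions (the sample values are arbitrary; results shown as comments):

# gem install numo-narray
require 'numo/narray'

a = Numo::DFloat[-1.0, 0.0, 2.0]
a * 2.0     # => Numo::DFloat[-2, 0, 4]   element-wise arithmetic
a >= 0      # => Numo::Bit[0, 1, 1]       comparisons yield a bit mask
a[a >= 0]   # => Numo::DFloat[0, 2]       mask indexing
a.sum       # => 1.0                      reduction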

require 'numo/narray'

def identity_function(x)
  x
end

def step_function(x)
  out = Numo::UInt32.zeros(x.shape)
  out[x >= 0] = 1   # boolean-mask assignment, as in NumPy
  out
end

def sigmoid(x)
  1 / (1 + Numo::NMath.exp(-x))
end
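
Sanity check (outputs rounded): sigmoid(0) should be exactly 0.5.

sigmoid(Numo::DFloat[-1.0, 0.0, 1.0])
# => Numo::DFloat[0.268941, 0.5, 0.731059] (approx.)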

def sigmoid_grad(x)
  s = sigmoid(x)   # evaluate once instead of twice
  (1.0 - s) * s
end
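
The derivative peaks at x = 0, where sigmoid(0) = 0.5 gives 0.5 * 0.5:

sigmoid_grad(Numo::DFloat[0.0])   # => Numo::DFloat[0.25]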

def relu(x)
  out = x.dup   # dup so the input array is left untouched
  out[x < 0] = 0
  out
end
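
Usage (arbitrary values): negatives are clamped to zero, everything else passes through:

relu(Numo::DFloat[-2.0, 0.5, 3.0])   # => Numo::DFloat[0, 0.5, 3]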

def relu_grad(x)
  grad = Numo::DFloat.zeros(x.shape)
  grad[x>=0] = 1
  grad
end
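
The gradient is 1 wherever the input was non-negative (this code assigns 1 at x = 0), and 0 elsewhere:

relu_grad(Numo::DFloat[-2.0, 0.5, 3.0])   # => Numo::DFloat[0, 1, 1]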

def softmax(x)
  if x.ndim == 2
    x = x.transpose
    x = x - x.max(axis: 0)             # per-column max, for numerical stability
    exp_x = Numo::NMath.exp(x)
    return (exp_x / exp_x.sum(axis: 0)).transpose
  end

  exp_x = Numo::NMath.exp(x - x.max)   # subtract the max to avoid overflow
  exp_x / exp_x.sum
end
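
Checking with the input the book uses for its softmax example (outputs rounded); the result is a probability distribution, so it sums to 1:

y = softmax(Numo::DFloat[0.3, 2.9, 4.0])
# => Numo::DFloat[0.0182113, 0.245192, 0.736597] (approx.)
y.sum   # => 1.0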

def sum_squared_error(y, t)
  0.5 * ((y-t)**2).sum
end
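
Example with the book's toy values: a one-hot target t for class 2, and a prediction y that puts 0.6 on that class:

y = Numo::DFloat[0.1, 0.05, 0.6, 0.0, 0.05, 0.1, 0.0, 0.1, 0.0, 0.0]
t = Numo::DFloat[0, 0, 1, 0, 0, 0, 0, 0, 0, 0]
sum_squared_error(y, t)   # => 0.0975 (approx.)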

def cross_entropy_error(y, t)
  if y.ndim == 1
    t = t.reshape(1, t.size)
    y = y.reshape(1, y.size)
  end
  if t.size == y.size
    # one-hot labels -> class indices (max_index returns flat indices)
    t = t.max_index(axis: 1) % y.shape[1]
  end

  delta = 1e-7   # avoid log(0)
  batch_size = y.shape[0]
  seq = Numo::UInt32.new(batch_size).seq
  # pick y[i, t[i]]: a single index array addresses the flattened array in Numo
  -(Numo::NMath.log(y[seq * y.shape[1] + t] + delta)).sum / batch_size
end
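
With the same y and t as above, only the probability assigned to the correct class matters, so the loss is -log(0.6):

cross_entropy_error(y, t)   # => 0.5108 (approx.)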

def softmax_loss(x, t)
  y = softmax(x)
  cross_entropy_error(y, t)
end
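
End-to-end check: softmax of [0.3, 2.9, 4.0] puts about 0.7366 on class 2, so the loss against a one-hot target for class 2 is about -log(0.7366):

x = Numo::DFloat[0.3, 2.9, 4.0]
t = Numo::DFloat[0, 0, 1]
softmax_loss(x, t)   # => 0.3058 (approx.)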