
E資格 Exam Prep: Activation Functions

Last updated at Posted at 2022-08-24

ReLU function

import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs become 0
    return np.maximum(0, x)

Softmax function


def softmax(x):
    x = x.T
    # Subtract the per-sample max so np.exp cannot overflow;
    # this does not change the result because softmax is shift-invariant
    x = x - np.max(x, axis=0)
    x = np.exp(x) / np.sum(np.exp(x), axis=0)
    return x.T
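As a quick sanity check (the input values below are hypothetical, not from the article), the max subtraction lets the function handle large logits that would otherwise overflow np.exp, and each output row still sums to 1:

```python
import numpy as np

def softmax(x):
    x = x.T
    x = x - np.max(x, axis=0)  # numerical stability
    x = np.exp(x) / np.sum(np.exp(x), axis=0)
    return x.T

# Logits around 1000 would overflow np.exp without the max subtraction
logits = np.array([[1000.0, 1001.0, 1002.0],
                   [-5.0, 0.0, 5.0]])
probs = softmax(logits)
print(np.allclose(probs.sum(axis=1), 1.0))  # True
```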

Sigmoid function


def sigmoid(x):
    # 1 / (1 + e^(-x)): squashes inputs into the open interval (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

tanh function


def tanh(x):
    # tanh(x) = sinh(x) / cosh(x)
    sinh = (np.exp(x) - np.exp(-x)) / 2.0
    cosh = (np.exp(x) + np.exp(-x)) / 2.0
    return sinh / cosh
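The hand-rolled version above should agree with NumPy's built-in np.tanh on moderate inputs (for very large |x| the explicit np.exp calls overflow, which np.tanh avoids). A quick check with hypothetical sample points:

```python
import numpy as np

def tanh(x):
    sinh = (np.exp(x) - np.exp(-x)) / 2.0
    cosh = (np.exp(x) + np.exp(-x)) / 2.0
    return sinh / cosh

x = np.linspace(-3, 3, 7)
print(np.allclose(tanh(x), np.tanh(x)))  # True
```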

Leaky ReLU function

The slope alpha for negative inputs is a hyperparameter you have to set yourself.


def lrelu(x, alpha=0.01):
    # For 0 < alpha < 1, max(alpha*x, x) gives x for positive inputs
    # and alpha*x for negative inputs
    return np.maximum(alpha * x, x)

# Alternative implementation
def lrelu(x, alpha=0.01):
    return np.where(x > 0.0, x, alpha * x)
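The two patterns should produce identical results for 0 < alpha < 1. A quick comparison on hypothetical inputs (function names here are made up to hold both versions at once):

```python
import numpy as np

def lrelu_max(x, alpha=0.01):
    return np.maximum(alpha * x, x)

def lrelu_where(x, alpha=0.01):
    return np.where(x > 0.0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(np.allclose(lrelu_max(x), lrelu_where(x)))  # True
```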

Layer1 = activation(Layer0 * weight1 + bias1)

Layer1 = ReLU(Layer0 * w1 + b1)
self.layer1 = relu(np.dot(self.layer0, self.w1) + self.b1)
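A minimal self-contained sketch of that forward step (the batch size, layer sizes, and random weights below are hypothetical, chosen only to show the shapes):

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

rng = np.random.default_rng(0)
layer0 = rng.normal(size=(4, 3))   # batch of 4 samples, 3 features each
w1 = rng.normal(size=(3, 5))       # weight matrix mapping 3 -> 5 features
b1 = np.zeros(5)                   # bias vector

layer1 = relu(np.dot(layer0, w1) + b1)
print(layer1.shape)  # (4, 5)
```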

GAN Generator


import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_size, feature_size, ngpu):
        # z_size: dimensionality of the noise z sampled from a uniform distribution.
        # feature_size: dimensionality of the first fully connected layer.
        # ngpu: number of GPUs available.

        super(Generator, self).__init__()
        self.ngpu = ngpu
        self.net = nn.Sequential(
            # Affine transformation
            nn.Linear(z_size, feature_size),
            # Batch normalization via PyTorch's nn.BatchNorm1d
            nn.BatchNorm1d(feature_size),
            # The argument is LeakyReLU's negative_slope (0.2 here)
            nn.LeakyReLU(0.2),

            # Double the dimensionality
            nn.Linear(feature_size, feature_size * 2),
            nn.BatchNorm1d(feature_size * 2),
            nn.LeakyReLU(0.2),

            # Double the dimensionality again
            nn.Linear(feature_size * 2, feature_size * 4),
            nn.BatchNorm1d(feature_size * 4),
            nn.LeakyReLU(0.2),

            # Project to the 28x28 output image, squashed into (-1, 1) by Tanh
            nn.Linear(feature_size * 4, 28 * 28),
            nn.Tanh()
        )

    def forward(self, input):
        x = input.view(-1, 100)
        x = self.net(x)
        x = x.view(-1, 28, 28)
        return x

class Discriminator(nn.Module):
    def __init__(self, feature_size, ngpu):
        super(Discriminator, self).__init__()
        self.ngpu = ngpu
        self.net = nn.Sequential(
            # Fully connected layer mapping the 28*28-dim input to feature_size*4 dims
            nn.Linear(28 * 28, feature_size * 4),
            nn.LeakyReLU(0.2),

            nn.Linear(feature_size * 4, feature_size * 2),
            nn.LeakyReLU(0.2),

            nn.Linear(feature_size * 2, feature_size),
            nn.LeakyReLU(0.2),

            # Single logit squashed to a real/fake probability in (0, 1)
            nn.Linear(feature_size, 1),
            nn.Sigmoid()
        )

    def forward(self, input):
        x = input.view(-1, 28 * 28)
        x = self.net(x)
        x = x.view(-1)
        return x
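A quick smoke test of the two networks (the classes are repeated here so the block runs on its own; the sizes are hypothetical, except that z_size must be 100 because Generator.forward reshapes its input to (-1, 100)):

```python
import torch
import torch.nn as nn

class Generator(nn.Module):
    def __init__(self, z_size, feature_size, ngpu):
        super(Generator, self).__init__()
        self.ngpu = ngpu
        self.net = nn.Sequential(
            nn.Linear(z_size, feature_size),
            nn.BatchNorm1d(feature_size),
            nn.LeakyReLU(0.2),
            nn.Linear(feature_size, feature_size * 2),
            nn.BatchNorm1d(feature_size * 2),
            nn.LeakyReLU(0.2),
            nn.Linear(feature_size * 2, feature_size * 4),
            nn.BatchNorm1d(feature_size * 4),
            nn.LeakyReLU(0.2),
            nn.Linear(feature_size * 4, 28 * 28),
            nn.Tanh()
        )

    def forward(self, input):
        x = input.view(-1, 100)
        x = self.net(x)
        return x.view(-1, 28, 28)

class Discriminator(nn.Module):
    def __init__(self, feature_size, ngpu):
        super(Discriminator, self).__init__()
        self.ngpu = ngpu
        self.net = nn.Sequential(
            nn.Linear(28 * 28, feature_size * 4),
            nn.LeakyReLU(0.2),
            nn.Linear(feature_size * 4, feature_size * 2),
            nn.LeakyReLU(0.2),
            nn.Linear(feature_size * 2, feature_size),
            nn.LeakyReLU(0.2),
            nn.Linear(feature_size, 1),
            nn.Sigmoid()
        )

    def forward(self, input):
        x = input.view(-1, 28 * 28)
        x = self.net(x)
        return x.view(-1)

z_size, feature_size, ngpu = 100, 128, 0   # hypothetical sizes
G = Generator(z_size, feature_size, ngpu)
D = Discriminator(feature_size, ngpu)

z = torch.randn(8, z_size)   # batch of 8 noise vectors (batch > 1 for BatchNorm1d)
fake = G(z)                  # (8, 28, 28), values in (-1, 1) from Tanh
scores = D(fake)             # (8,), values in (0, 1) from Sigmoid
print(fake.shape, scores.shape)
```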