A memo on the arguments of nn.KLDivLoss (I got confused by them)

Posted at 2019-08-07

What I want to do

  • Check the arguments of nn.KLDivLoss

Kullback-Leibler Divergence

It quantifies how far apart two probability distributions are.
To evaluate how much distribution P diverges from distribution Q, it is written as follows:

D_{KL}(P || Q) = \int_{-\infty}^{\infty}P(x)\log\biggl(\frac{P(x)}{Q(x)}\biggr)dx

For discrete probability distributions, simply replace the integral with a sum:

D_{KL}(P || Q) = \sum_{x\in X}P(x)\log\biggl(\frac{P(x)}{Q(x)}\biggr)
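As a quick sanity check, the discrete sum can be evaluated directly in plain Python. The values of P and Q below are the same illustrative ones used in the PyTorch examples later on.

import math

P = [0.36, 0.48, 0.16]
Q = [0.333, 0.333, 0.333]
# D_KL(P || Q) = sum over x of P(x) * log(P(x) / Q(x))
sum(p * math.log(p / q) for p, q in zip(P, Q))
# 0.0863...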

nn.KLDivLoss

Values of P and Q
import torch
P = torch.Tensor([0.36, 0.48, 0.16])
Q = torch.Tensor([0.333, 0.333, 0.333])
Computing it directly from the definition
(P * (P / Q).log()).sum()
# tensor(0.0863)
Calling it via nn (note the argument order: the first argument is the input as log-probabilities, Q.log(), and the second is the target distribution, P)
import torch.nn as nn
kldiv = nn.KLDivLoss(reduction="sum")
kldiv(Q.log(), P)
# tensor(0.0863)
Calling it via nn.functional
import torch.nn.functional as F
F.kl_div(Q.log(), P, None, None, 'sum')
# tensor(0.0863)
# The 3rd and 4th arguments (size_average and reduce) are deprecated, so leave them as None and set the behavior with the 5th argument, reduction.
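In recent PyTorch versions it is cleaner to skip the deprecated positional arguments and pass reduction as a keyword instead; this gives the same result as above.

F.kl_div(Q.log(), P, reduction='sum')
# tensor(0.0863)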

The default for reduction is 'mean'. The available options are:

  • 'none': no reduction will be applied
  • 'batchmean': the sum of the output will be divided by the batch size
  • 'sum': the output will be summed
  • 'mean': the output will be divided by the number of elements in the output
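To see how each reduction behaves, here is a small sketch. The two-row P and Q below are made-up values standing in for a batch of two distributions; the point is only how each reduction aggregates the elementwise terms.

import torch
import torch.nn.functional as F

# A batch of two (illustrative) distributions per tensor
P = torch.tensor([[0.36, 0.48, 0.16],
                  [0.30, 0.40, 0.30]])
Q = torch.tensor([[0.333, 0.333, 0.333],
                  [0.250, 0.500, 0.250]])

F.kl_div(Q.log(), P, reduction='none')       # elementwise P * (log P - log Q), shape (2, 3)
F.kl_div(Q.log(), P, reduction='sum')        # sum over all 6 elements
F.kl_div(Q.log(), P, reduction='mean')       # sum divided by the number of elements (6)
F.kl_div(Q.log(), P, reduction='batchmean')  # sum divided by the batch size (2)

Note that 'batchmean' matches the mathematical definition of the average per-sample KL divergence, while 'mean' divides by the total number of elements.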

