Explanation of nn.Embedding

Last updated at Posted at 2024-01-19

I couldn't understand the explanation in PyTorch's nn.Embedding documentation (quoted below), so I looked into it.
What even is a "lookup table"?

A simple lookup table that stores embeddings of a fixed dictionary and size. This module is often used to store word embeddings and retrieve them using indices. The input to the module is a list of indices, and the output is the corresponding word embeddings.

The article I found easiest to understand:
https://medium.com/@hunter-j-phillips/the-embedding-layer-27d9c980d124
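Before getting into the details, here is a minimal sketch of how nn.Embedding is used (the sizes 10 and 3 are arbitrary values chosen for illustration):

```python
import torch
import torch.nn as nn

# A vocabulary of 10 tokens, each mapped to a 3-dimensional embedding.
embedding = nn.Embedding(num_embeddings=10, embedding_dim=3)

# The input is a list of token indices, not one-hot vectors.
indices = torch.tensor([1, 4, 1])

# The output is the corresponding embedding vectors.
vectors = embedding(indices)
print(vectors.shape)  # torch.Size([3, 3])

# The same token index always returns the same vector.
assert torch.equal(vectors[0], vectors[2])
```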

Training

Represent the input as one-hot vectors of shape (seq_length, vocab_size) and multiply by a parameter matrix of shape (vocab_size, dim) to convert them into distributed representations.
During training, backpropagation proceeds as usual, so this matrix product is computed every time.
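The key observation is that multiplying a one-hot vector by the parameter matrix just selects one of its rows. A NumPy sketch of this equivalence (the sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, dim = 5, 4

# Parameter matrix of shape (vocab_size, dim).
W = rng.normal(size=(vocab_size, dim))

# A token sequence as indices, and the same sequence as one-hot rows.
tokens = np.array([2, 0, 2])
one_hot = np.eye(vocab_size)[tokens]  # (seq_length, vocab_size)

# Multiplying one-hot vectors by W selects rows of W.
by_matmul = one_hot @ W               # (seq_length, dim)
by_indexing = W[tokens]               # same result, without any matmul

assert np.allclose(by_matmul, by_indexing)
```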

Inference

For each of the vocab_size tokens, precompute and store the distributed representation obtained from the matrix product with the parameter matrix; then, when a token arrives, no matrix product is needed at all. This stored table of per-token vectors is the "lookup table".
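In PyTorch the weight matrix itself already plays this role: row i of nn.Embedding's weight is the embedding of token i, and the forward pass simply indexes into it. A small check (sizes 6 and 2 are arbitrary):

```python
import torch
import torch.nn as nn

emb = nn.Embedding(num_embeddings=6, embedding_dim=2)

# The "lookup table" is the weight matrix: row i is token i's embedding.
table = emb.weight.detach()

idx = torch.tensor([3, 5])
# The forward pass is equivalent to plain row indexing into the table.
assert torch.equal(emb(idx).detach(), table[idx])
```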
