
Notes on the KV-MemNN paper (Miller et al., 2016)

Posted at 2020-05-31

A brief history

end-to-end Memory Network
KV-MemNN
Transformer
BERT

Abstract

The approach is more flexible than a Knowledge Base (KB): because Key-Value Memory Networks separate the memory $(k, v)$ from the encoder, they can read documents directly.

Method


The learned parameters are $A$, $B$, and $R_j$. The memory slots $(k, v)$ keep the values selected at the start. After iterating over the hops $j = 1, \dots, H$, the final query $q^{H+1}$ is used to produce the output.

Model overview
・Input the $Query$
・It is matched against the $Keys$
・The $Key$ most relevant to the $Query$ fires
・That slot opens and its $Value$ is read
・Repeat over multiple hops
・Gradually narrow down the $Values$

$ y_i = \sum_{j = 1}^{n} \alpha_{i, j} V_j$
$\alpha_{i,j} = \mathrm{Attention}(Query, Key) = \mathrm{softmax}(QK^T)$
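The read loop described above can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: embeddings are random, the function name `kv_memnn_read` is hypothetical, and answer scoring after the final hop is omitted.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def kv_memnn_read(q, keys, values, R_list):
    """Sketch of the KV-MemNN read loop (hypothetical helper).

    q:       query embedding, shape (d,)
    keys:    memory key embeddings, shape (n, d)
    values:  memory value embeddings, shape (n, d)
    R_list:  one (d, d) update matrix R_j per hop
    """
    for R in R_list:
        alpha = softmax(keys @ q)  # key addressing: relevance of each memory slot
        o = alpha @ values         # value reading: weighted sum of the values
        q = R @ (q + o)            # query update for the next hop
    return q                       # q^{H+1}, later used to score candidate answers

rng = np.random.default_rng(0)
d, n, hops = 8, 5, 2
q = rng.standard_normal(d)
keys = rng.standard_normal((n, d))
values = rng.standard_normal((n, d))
R_list = [rng.standard_normal((d, d)) for _ in range(hops)]
out = kv_memnn_read(q, keys, values, R_list)
print(out.shape)  # (8,)
```

Each hop reuses the same memory; only the query changes, which is how the model "narrows down" the relevant values.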

Review of attention

The attention mechanism is built from a $Query$, a $Key$, and a $Value$.

$ Attention(Query, Key, Value) = \mathrm{softmax}\left(\frac{QK^T}{\sqrt{d_k}}\right)V$

Now consider self-attention (this part is not in the paper):

$Q_i(x) := xW_i^Q + b_i^Q$
$K_i(x) := xW_i^K + b_i^K$
$V_i(x) := xW_i^V + b_i^V$
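Putting the three projections and the scaled dot-product together, single-head self-attention can be sketched as follows (a toy NumPy version with random weights; the function name `self_attention` is an assumption for illustration):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, Wq, bq, Wk, bk, Wv, bv):
    """Single-head scaled dot-product self-attention (sketch).

    x: (seq_len, d_model); the projection matrices map d_model -> d_k.
    """
    Q = x @ Wq + bq                          # Q_i(x) = x W_i^Q + b_i^Q
    K = x @ Wk + bk                          # K_i(x) = x W_i^K + b_i^K
    V = x @ Wv + bv                          # V_i(x) = x W_i^V + b_i^V
    d_k = Q.shape[-1]
    weights = softmax(Q @ K.T / np.sqrt(d_k))  # (seq_len, seq_len)
    return weights @ V

rng = np.random.default_rng(1)
seq_len, d_model, d_k = 4, 8, 8
x = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) * 0.1 for _ in range(3))
bq = bk = bv = np.zeros(d_k)
out = self_attention(x, Wq, bq, Wk, bk, Wv, bv)
print(out.shape)  # (4, 8)
```

The division by $\sqrt{d_k}$ keeps the dot products from saturating the softmax when $d_k$ is large.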


Key-Value Memories

Sentence Level < Window Level < Window + Title < Window + Center Encoding + Title

1. KB Triple

"subject, relation, object" = $(Key, Key, Value)$, i.e. the subject and relation together form the $Key$, and the object is the $Value$.

2. Sentence Level

Both $Key$ and $Value$ = bag-of-words of the entire sentence

3. Window Level

・Split the document into windows of size $W$
・$Key$ = bag of words of the entire window
・$Value$ = the center word of the window
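The window-level construction above is easy to sketch. A toy example (the helper name `window_memories` is hypothetical; keys are kept as token lists rather than actual bag-of-words vectors):

```python
def window_memories(tokens, window=5):
    """Build window-level (key, value) memories (sketch).

    key   = the words of the whole window (stands in for its bag-of-words)
    value = the center word of the window
    """
    memories = []
    half = window // 2
    for i in range(half, len(tokens) - half):
        key = tokens[i - half : i + half + 1]  # whole window
        value = tokens[i]                      # center word
        memories.append((key, value))
    return memories

doc = "the movie blade runner was directed by ridley scott".split()
for key, value in window_memories(doc, window=5)[:2]:
    print(key, "->", value)
```

For the 9-token document above with $W = 5$, the first memory pairs the window "the movie blade runner was" with its center word "blade".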

4. Window + Center Encoding

Honestly, I don't fully understand this one... (In the paper, the feature dictionary is doubled: the center word of each window is encoded with the second copy of the dictionary, so the model can distinguish the center word from its surrounding context.)

5. Window + Title

・$Key$ = bag of words of the entire window
・$Value$ = the title of the document

References

Key-Value Memory Networks for Directly Reading Documents (Miller et al., 2016)
Attention Is All You Need (Vaswani et al., 2017)
