
Implementing BERT's pre-training task: Next Sentence Prediction

import torch
from transformers import BertJapaneseTokenizer, BertForNextSentencePrediction

# Tokenizer and NSP model pre-trained on Japanese Wikipedia (Tohoku University)
model_name = 'cl-tohoku/bert-base-japanese-whole-word-masking'
bert_tokenizer = BertJapaneseTokenizer.from_pretrained(model_name)
nsp_bert = BertForNextSentencePrediction.from_pretrained(model_name)
nsp_bert.eval()

prompt = '私の家族は5人家族です。'
next_sentence = '家族は、父、母、兄、私、妹です。'

# Encode the sentence pair; token_type_ids mark sentence A (0) vs. sentence B (1)
input_tensor = bert_tokenizer(prompt, next_sentence, return_tensors='pt')
print(input_tensor)

{'input_ids': tensor([[ 2, 1325, 5, 2283, 9, 76, 53, 2283, 2992, 8, 3, 2283,
9, 6, 800, 6, 968, 6, 1456, 6, 1325, 6, 4522, 2992,
8, 3]]), 'token_type_ids': tensor([[0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1]]), 'attention_mask': tensor([[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1,
1, 1]])}
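The `token_type_ids` above are what let BERT tell the two sentences apart: segment 0 covers `[CLS]`, the first sentence, and its `[SEP]`; segment 1 covers the second sentence and the final `[SEP]`. As a rough illustration of that layout (the token lists below are hand-written to mirror the output above, not produced by the tokenizer):

```python
# Hand-written token lists standing in for the tokenizer output above.
sent_a = ['私', 'の', '家族', 'は', '5', '人', '家族', 'です', '。']
sent_b = ['家族', 'は', '、', '父', '、', '母', '、', '兄',
          '、', '私', '、', '妹', 'です', '。']

# Sentence pair layout: [CLS] A [SEP] B [SEP]
tokens = ['[CLS]'] + sent_a + ['[SEP]'] + sent_b + ['[SEP]']

# Segment 0: [CLS] + sentence A + first [SEP]; segment 1: sentence B + final [SEP]
token_type_ids = [0] * (len(sent_a) + 2) + [1] * (len(sent_b) + 1)

print(len(tokens))               # 26 tokens, matching the tensor above
print(token_type_ids.count(0))   # 11 zeros
print(token_type_ids.count(1))   # 15 ones
```

The counts (11 zeros, 15 ones, 26 tokens total) line up with the `token_type_ids` tensor printed above.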

list(bert_tokenizer.get_vocab().items())[:5]

[('[PAD]', 0), ('[UNK]', 1), ('[CLS]', 2), ('[SEP]', 3), ('[MASK]', 4)]

output = nsp_bert(**input_tensor)
print(output)

NextSentencePredictorOutput(loss=None, logits=tensor([[14.3159, 2.9107]], grad_fn=), hidden_states=None, attentions=None)

torch.argmax(output.logits)

tensor(0)
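Index 0 of the NSP head's logits corresponds to "sentence B actually follows sentence A", and index 1 to "sentence B is a random sentence", so an argmax of 0 means the pair was judged consecutive. Applying a softmax to the logits shows how confident the model is (the values below are copied from the output printed above):

```python
import torch

# NSP logits from the model output above:
# index 0 = "is the next sentence", index 1 = "is a random sentence"
logits = torch.tensor([[14.3159, 2.9107]])

# Convert logits to probabilities over the two labels
probs = torch.softmax(logits, dim=-1)
print(probs)                       # nearly all probability mass on label 0
print(torch.argmax(probs).item())  # 0 -> predicted as a consecutive pair
```

With a logit gap of more than 11, the model assigns well over 99% probability to the "next sentence" label.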

The prediction came out as expected.
That completes a simple implementation of Next Sentence Prediction.
