
[Today's Abstract] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding [Paper DeepL Translation]

Once a day, I read the abstract of a paper with the help of DeepL translation.

This article is more or less a memo for myself.
It is mostly brought to you by DeepL translation.
If you spot any mistakes, I would be glad if you pointed them out.

Translation source
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding

Abstract

Translation

We introduce BERT, short for Bidirectional Encoder Representations from Transformers, a new language representation model. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As a result, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial changes to task-specific architectures.
BERT is conceptually simple and empirically powerful. It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE score to 80.5% (a 7.7-point absolute improvement), MultiNLI accuracy to 86.7% (a 4.6% absolute improvement), SQuAD v1.1 question answering Test F1 to 93.2 (a 1.5-point absolute improvement), and SQuAD v2.0 Test F1 to 83.1 (a 5.1-point absolute improvement).

Original

We introduce a new language representation model called BERT, which stands for Bidirectional Encoder Representations from Transformers. Unlike recent language representation models, BERT is designed to pre-train deep bidirectional representations from unlabeled text by jointly conditioning on both left and right context in all layers. As a result, the pre-trained BERT model can be fine-tuned with just one additional output layer to create state-of-the-art models for a wide range of tasks, such as question answering and language inference, without substantial task-specific architecture modifications.
BERT is conceptually simple and empirically powerful. It obtains new state-of-the-art results on eleven natural language processing tasks, including pushing the GLUE score to 80.5% (7.7% point absolute improvement), MultiNLI accuracy to 86.7% (4.6% absolute improvement), SQuAD v1.1 question answering Test F1 to 93.2 (1.5 point absolute improvement) and SQuAD v2.0 Test F1 to 83.1 (5.1 point absolute improvement).
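
The abstract's claim that a pre-trained BERT can be fine-tuned "with just one additional output layer" is easy to picture in code. Here is a minimal sketch, assuming the Hugging Face transformers library and its public bert-base-uncased checkpoint (neither is mentioned in the paper); the BertClassifier class and its linear head are my illustration, not the authors' code.

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizer

class BertClassifier(nn.Module):
    """A pre-trained BERT encoder plus one task-specific output layer."""

    def __init__(self, num_labels: int = 2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # The "one additional output layer" from the abstract: a single
        # linear head over the pooled [CLS] representation.
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        return self.classifier(outputs.pooler_output)

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertClassifier(num_labels=2)
batch = tokenizer(["BERT is conceptually simple."], return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])
print(logits.shape)  # torch.Size([1, 2])
```

During fine-tuning, all parameters (the pre-trained encoder and the new head) are updated end-to-end on the downstream task, which is why so little task-specific architecture is needed.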
