
"categorical_crossentropy"と"sparse_categorical_crossentropy"の違い

Posted at 2020-07-30

Conclusion

  • The only difference is the kind of label you pass. "categorical_crossentropy" takes one-hot labels (exactly one element is 1 and all the others are 0), while "sparse_categorical_crossentropy" takes integer labels. (See the sketch below.)
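
To make the contrast concrete, here is a minimal sketch (not from the original article; it assumes TensorFlow/Keras is installed and uses random toy data) that trains the same model once with each loss:

```python
import numpy as np
import tensorflow as tf

# Toy data: 100 samples, 20 features, 10 classes (shapes are arbitrary, for illustration only)
x_train = np.random.rand(100, 20).astype("float32")
y_int = np.random.randint(0, 10, size=100)   # integer labels, shape (100,)
y_onehot = np.eye(10)[y_int]                 # one-hot labels, shape (100, 10)

def build_model():
    return tf.keras.Sequential([
        tf.keras.layers.Dense(32, activation="relu", input_shape=(20,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])

# "categorical_crossentropy" expects one-hot labels
model = build_model()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_onehot, epochs=1, verbose=0)

# "sparse_categorical_crossentropy" expects integer labels
model = build_model()
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_int, epochs=1, verbose=0)
```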

Difference between one-hot and integer representations

Example: 10-class classification

| one-hot representation | integer representation |
| --- | --- |
| [0. 0. 0. 0. 0. 0. 0. 0. 0. 1.] | [9] |
| [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.] | [2] |
| [0. 1. 0. 0. 0. 0. 0. 0. 0. 0.] | [1] |
| [0. 0. 0. 0. 0. 1. 0. 0. 0. 0.] | [5] |
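
The two columns encode the same information: the position of the 1 in each one-hot row is the integer label. A quick check with NumPy (the arrays below are just the rows of the table above):

```python
import numpy as np

onehot = np.array([
    [0., 0., 0., 0., 0., 0., 0., 0., 0., 1.],
    [0., 0., 1., 0., 0., 0., 0., 0., 0., 0.],
    [0., 1., 0., 0., 0., 0., 0., 0., 0., 0.],
    [0., 0., 0., 0., 0., 1., 0., 0., 0., 0.],
])

# argmax of each one-hot row recovers the integer label
print(np.argmax(onehot, axis=1))  # -> [9 2 1 5]
```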

Converting integer labels to one-hot labels

My impression is that many datasets ship with integer labels, yet most loss functions do not accept integer labels and only work when given one-hot labels, so in those cases you need to convert. (If anything, loss functions like "sparse_categorical_crossentropy" that can train directly on integer labels feel like the minority.)

The code is shown below.

```python
import numpy as np

# Number of classes, inferred once from the training labels
n_labels = len(np.unique(train_labels))

# Indexing an identity matrix with the integer labels yields one-hot vectors
train_labels_onehot = np.eye(n_labels)[train_labels]
test_labels_onehot = np.eye(n_labels)[test_labels]
```
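
For reference, if you are already using TensorFlow/Keras (not shown in the original snippet, so treat this as an aside), `tf.keras.utils.to_categorical` performs the same conversion:

```python
import numpy as np
import tensorflow as tf

train_labels = np.array([9, 2, 1, 5])  # integer labels, as in the table above
train_labels_onehot = tf.keras.utils.to_categorical(train_labels, num_classes=10)
print(train_labels_onehot.shape)  # -> (4, 10)
```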