
Machine Learning, Chapter 4: Principal Component Analysis


#### Outline

  • Compress the structure of multivariate data into a smaller number of indicators

We want to keep the information loss that comes with reducing the number of variables as small as possible.
Analysis and visualization using only a few variables (in the 2- or 3-dimensional case) then become feasible, as in the sketch below.
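As a quick illustration of this idea, here is a minimal sketch (not part of the original note; the Iris dataset and all variable names are assumptions for illustration) that compresses 4-dimensional data down to 2 components with scikit-learn and reports how much variance is retained.

```python
# Minimal sketch: compress 4-dimensional data to 2 components for visualization.
# The Iris dataset and variable names here are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)      # X has shape (150, 4)

pca = PCA(n_components=2)              # keep the two axes with the largest variance
X_2d = pca.fit_transform(X)            # shape (150, 2): now plottable in a plane

# Fraction of the total variance (information) retained by each component
print(pca.explained_variance_ratio_)   # e.g. roughly [0.92, 0.05]
```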

  • Training data
x_i=(x_{i1},x_{i2},\ldots,x_{im})\in\mathbb{R}^m
  • Mean vector
\bar{x}=\frac{1}{n}\sum_{i=1}^{n}x_i
  • Data matrix (each row is an observation centered by the mean)
\bar{X}=(x_1-\bar{x},\ldots,x_n-\bar{x})^T\in\mathbb{R}^{n\times m}
  • Variance-covariance matrix
\Sigma=Var(\bar{X})=\frac{1}{n}\bar{X}^T\bar{X}
  • Vector after the linear transformation
s_j=(s_{1j},\ldots,s_{nj})^T=\bar{X}a_j,\quad a_j\in\mathbb{R}^m

  • If the coefficient vector changes, the values after the linear transformation also change (the quantities above are computed explicitly in the sketch after this list)
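To make these definitions concrete, here is a minimal NumPy sketch (an addition, not from the original note; the toy data values and variable names are assumptions) that computes the mean vector, the centered data matrix, the variance-covariance matrix, and the projection onto one coefficient vector.

```python
import numpy as np

# Toy learning data: n = 5 observations of m = 3 variables (illustrative values).
X = np.array([[2.0, 0.0, 1.0],
              [1.5, 0.5, 0.8],
              [3.0, 1.0, 2.2],
              [2.5, 0.2, 1.5],
              [1.0, 0.8, 0.5]])
n, m = X.shape

x_bar = X.mean(axis=0)            # mean vector  (1/n) * sum_i x_i
X_bar = X - x_bar                 # centered data matrix (rows: x_i - x_bar)
Sigma = X_bar.T @ X_bar / n       # variance-covariance matrix (1/n) X_bar^T X_bar

a_j = np.array([1.0, 0.0, 0.0])   # some unit-length coefficient vector a_j in R^m
s_j = X_bar @ a_j                 # transformed vector s_j = X_bar a_j

# Variance of the transformed variable; it changes whenever a_j changes.
print(s_j.var())                  # equals a_j @ Sigma @ a_j
```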

We interpret the amount of information as the magnitude of the variance.
We therefore search for the projection axis that maximizes the variance of the variable after the linear transformation.
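The derivation below is not spelled out in the note itself; it is the standard variance-maximization argument, written with the notation already defined above. The variance of the transformed variable is

Var(s_j)=\frac{1}{n}s_j^Ts_j=\frac{1}{n}(\bar{X}a_j)^T(\bar{X}a_j)=a_j^T\,Var(\bar{X})\,a_j

Since scaling $a_j$ scales this value arbitrarily, we impose the constraint $a_j^Ta_j=1$ and maximize the Lagrangian

E(a_j)=a_j^T\,Var(\bar{X})\,a_j-\lambda(a_j^Ta_j-1)

Setting the derivative with respect to $a_j$ to zero gives

\frac{\partial E(a_j)}{\partial a_j}=2\,Var(\bar{X})\,a_j-2\lambda a_j=0
\;\Rightarrow\; Var(\bar{X})\,a_j=\lambda a_j

so the projection axes are eigenvectors of the variance-covariance matrix, and the variance along $a_j$ equals the corresponding eigenvalue, $Var(s_j)=\lambda$. The eigenvector with the largest eigenvalue gives the first principal component. Numerically this amounts to an eigendecomposition of $\Sigma$, e.g. np.linalg.eigh(Sigma) on the matrix computed in the sketch above, taking eigenvectors in decreasing order of their eigenvalues.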

