[[File:PCA-rotated.png|600px]]

This is the training set rotated into the <math>\textstyle u_1</math>, <math>\textstyle u_2</math> basis. In the general case, <math>\textstyle U^Tx</math> will be the training set rotated into the basis <math>\textstyle u_1</math>, <math>\textstyle u_2</math>, ..., <math>\textstyle u_n</math>.

One of the properties of <math>\textstyle U</math> is that it is an "orthogonal" matrix, which means that it satisfies <math>\textstyle U^TU = UU^T = I</math>. So if you ever need to go from the rotated vectors <math>\textstyle x_{\rm rot}</math> back to the original data <math>\textstyle x</math>, you can compute

:<math>\begin{align}
x = U x_{\rm rot} ,
\end{align}</math>

because <math>\textstyle U x_{\rm rot} = UU^T x = x</math>.

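The rotation and its inverse can be checked numerically. The following is a minimal sketch (not code from the tutorial): it builds <math>\textstyle U</math> from the covariance of zero-mean 2-D data, rotates the data with <math>\textstyle U^Tx</math>, and recovers it with <math>\textstyle Ux_{\rm rot}</math>. The data layout (one example per column) follows the tutorial's convention; the random data itself is an illustrative assumption.

```python
import numpy as np

# Illustrative zero-mean 2-D data: one training example per column (n x m).
rng = np.random.default_rng(0)
x = rng.normal(size=(2, 100))
x = x - x.mean(axis=1, keepdims=True)

sigma = x @ x.T / x.shape[1]      # empirical covariance matrix
U, S, _ = np.linalg.svd(sigma)    # columns of U are the basis u_1, u_2

x_rot = U.T @ x                   # rotate into the u_1, u_2 basis
x_rec = U @ x_rot                 # U U^T = I, so this recovers x

print(np.allclose(U.T @ U, np.eye(2)))  # True: U is orthogonal
print(np.allclose(x_rec, x))            # True: rotation is undone exactly
```

Because <math>\textstyle U</math> is orthogonal, the round trip is exact up to floating-point error; no information is lost by the rotation alone.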
== Reducing the Data Dimension ==