Autoencoders and Sparsity
Second review: Sina Weibo, @大黄蜂的思索, http://weibo.com/u/1733291480
+ | |||
+ | 【原文】 | ||
+ | |||

So far, we have described the application of neural networks to supervised learning, in which we have labeled training examples. Now suppose we have only a set of unlabeled training examples <math>\textstyle \{x^{(1)}, x^{(2)}, x^{(3)}, \ldots\}</math>, where <math>\textstyle x^{(i)} \in \Re^{n}</math>. An '''autoencoder''' neural network is an unsupervised learning algorithm that applies backpropagation, setting the target values to be equal to the inputs. I.e., it uses <math>\textstyle y^{(i)} = x^{(i)}</math>.

Here is an autoencoder:
+ | |||
+ | 【初译】 | ||
+ | |||

So far, we have discussed the application of neural networks to supervised learning, in which the training examples are labeled. Now suppose we have only a set of unlabeled training examples <math>\textstyle \{x^{(1)}, x^{(2)}, x^{(3)}, \ldots\}</math>, where <math>\textstyle x^{(i)} \in \Re^{n}</math>. An autoencoder neural network is an unsupervised learning algorithm that applies backpropagation, setting the target values equal to the inputs, i.e., <math>\textstyle y^{(i)} = x^{(i)}</math>. The figure below shows an example of an autoencoder neural network.
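
To make the idea concrete, here is a minimal Python/NumPy sketch of an autoencoder trained exactly as described above: backpropagation with the target set to the input itself. This is not the tutorial's own implementation; the single sigmoid hidden layer, squared reconstruction error, plain batch gradient descent, the toy random data, and all names (<code>W1</code>, <code>W2</code>, <code>lr</code>, and so on) are illustrative assumptions.

<source lang="python">
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n, hidden = 8, 3                        # input dimension, hidden units (hidden < n; illustrative)
X = rng.random((100, n))                # toy unlabeled training examples x^(i)

W1 = rng.normal(scale=0.1, size=(n, hidden)); b1 = np.zeros(hidden)
W2 = rng.normal(scale=0.1, size=(hidden, n)); b2 = np.zeros(n)
lr = 0.5                                # learning rate (assumed value)

for epoch in range(2000):
    # Forward pass: encode, then decode; the target is the input itself.
    A1 = sigmoid(X @ W1 + b1)           # hidden-layer activations
    Y = sigmoid(A1 @ W2 + b2)           # reconstruction of x
    # Backpropagate the squared reconstruction error, using y^(i) = x^(i).
    d2 = (Y - X) * Y * (1.0 - Y)        # delta at the output layer
    d1 = (d2 @ W2.T) * A1 * (1.0 - A1)  # delta at the hidden layer
    W2 -= lr * A1.T @ d2 / len(X); b2 -= lr * d2.mean(axis=0)
    W1 -= lr * X.T @ d1 / len(X);  b1 -= lr * d1.mean(axis=0)

print("mean squared reconstruction error:", np.mean((Y - X) ** 2))
</source>

Because the hidden layer in this sketch has fewer units than the input, the network cannot simply copy its input and is pushed toward learning a compressed representation of the data.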