Linear Decoders

From Ufldl

Of course, when using backpropagation to compute the error terms for the ''hidden'' layer:

:<math>
\begin{align}
\delta^{(2)} = \left( (W^{(2)})^T \delta^{(3)} \right) \bullet f'(z^{(2)})
\end{align}
</math>
Because the hidden layer is using a sigmoid (or tanh) activation <math>f</math>, in the equation above <math>f'(\cdot)</math> should still be the derivative of the sigmoid (or tanh) function.
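The contrast above can be sketched in a few lines of NumPy: the linear output layer's error term drops the activation-derivative factor, while the sigmoid hidden layer keeps it. This is a minimal illustration, not the tutorial's own code; the variable names (<code>W2</code>, <code>z2</code>, <code>a3</code>) are assumptions chosen to mirror the layer indices in the equations.

```python
import numpy as np

def sigmoid(z):
    """Elementwise logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-z))

def decoder_deltas(W2, z2, a3, y):
    """Backprop error terms for an autoencoder with a linear decoder.

    W2 : weights from hidden layer (2) to output layer (3)
    z2 : pre-activation inputs to the hidden layer
    a3 : (linear) output-layer activations
    y  : targets (for an autoencoder, the inputs themselves)
    """
    # Linear output layer: no f'(z3) factor in the error term.
    delta3 = -(y - a3)
    # Sigmoid hidden layer: f'(z2) = sigmoid(z2) * (1 - sigmoid(z2)).
    a2 = sigmoid(z2)
    delta2 = (W2.T @ delta3) * a2 * (1.0 - a2)
    return delta2, delta3
```

Note that only the hidden-layer term multiplies by the sigmoid derivative; changing the decoder back to a sigmoid would reintroduce the corresponding factor in <code>delta3</code>.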
{{Languages|线性解码器|中文}}

Latest revision as of 04:06, 8 April 2013
