Deriving gradients using the backpropagation idea

== Introduction ==
In the section on the [[Backpropagation Algorithm | backpropagation algorithm]], you were briefly introduced to backpropagation as a means of deriving gradients for learning in the sparse autoencoder. It turns out that together with matrix calculus, this provides a powerful method and intuition for deriving gradients for more complex matrix functions (functions from matrices to the reals, or symbolically, from <math>\mathbb{R}^{r \times c} \rightarrow \mathbb{R}</math>).
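To make this concrete, here is a minimal numerical sketch (not part of the original tutorial) for a hypothetical example function <math>f(W) = \tfrac{1}{2}\|Wx - y\|^2</math>, whose gradient with respect to the matrix <math>W</math> is <math>(Wx - y)x^T</math>; the analytic gradient is checked against finite differences:

<pre>
import numpy as np

# Hypothetical example (not from the original text): a matrix function
# f : R^{r x c} -> R, here f(W) = 0.5 * ||W x - y||^2 for fixed x, y.
# Its gradient with respect to W is grad_W f = (W x - y) x^T, which we
# sanity-check below against a central-difference approximation.

def f(W, x, y):
    r = W @ x - y          # residual vector, shape (r,)
    return 0.5 * (r @ r)   # scalar output

def grad_f(W, x, y):
    r = W @ x - y
    return np.outer(r, x)  # same shape as W: (r, c)

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))
x = rng.standard_normal(4)
y = rng.standard_normal(3)

# Numerical gradient via central differences, one entry of W at a time
eps = 1e-6
num = np.zeros_like(W)
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        num[i, j] = (f(Wp, x, y) - f(Wm, x, y)) / (2 * eps)

# Should print a very small number (~1e-9 or less)
print(np.max(np.abs(num - grad_f(W, x, y))))
</pre>

This kind of finite-difference check is the same idea as the gradient checking described elsewhere in the tutorial, and is a useful habit whenever a gradient has been derived by hand.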

First, recall the backpropagation idea, which we present in a modified form appropriate for our purposes below:
