Exercise:Softmax Regression
From Ufldl
<tt>max(M)</tt> yields a row vector with each element giving the maximum value in that column. <tt>bsxfun</tt> (short for binary singleton expansion function) applies minus along each row of <tt>M</tt>, hence subtracting the maximum of each column from every element in the column.
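As a quick illustration (a minimal Octave/MATLAB sketch using a made-up 2x3 matrix, not part of the exercise code), the per-column maximum can be subtracted like this:

 % Hypothetical example matrix; rows index classes, columns index examples
 M = [1 5 3;
      2 4 6];
 colMax = max(M);               % row vector of per-column maxima: [2 5 6]
 M = bsxfun(@minus, M, colMax); % subtract each column's max from that column
 % M is now [-1 0 -3; 0 -1 0]

Since every element of each column has the same constant subtracted, the softmax probabilities are unchanged, but exponentiating the shifted values avoids overflow.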
'''Implementation Tip:''' Computing the predictions - you may also find <tt>bsxfun</tt> useful in computing your predictions. If you have a matrix <tt>M</tt> containing the <math>e^{\theta_j^T x^{(i)}}</math> terms, such that <tt>M(r, c)</tt> contains the <math>e^{\theta_r^T x^{(c)}}</math> term, you can use the following code to compute the hypothesis (by dividing all elements in each column by their column sum):
 % M is the matrix as described in the text
 M = bsxfun(@rdivide, M, sum(M)); % divide each column of M by its column sum
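Putting the two tips together, here is a minimal sketch of a numerically stable hypothesis computation. The variable names <tt>theta</tt> (a <tt>numClasses</tt>-by-<tt>inputSize</tt> weight matrix) and <tt>data</tt> (an <tt>inputSize</tt>-by-<tt>numExamples</tt> matrix of examples) are illustrative choices, not names prescribed by the exercise:

 % Hypothetical names: theta (numClasses x inputSize), data (inputSize x numExamples)
 M = theta * data;                % M(r, c) = theta_r' * x^(c)
 M = bsxfun(@minus, M, max(M));   % subtract each column's max for numerical stability
 M = exp(M);                      % exponentiate the shifted terms
 h = bsxfun(@rdivide, M, sum(M)); % normalize; each column of h now sums to 1

Each column of <tt>h</tt> then holds the predicted class probabilities for one example.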
[[Category:Exercises]]
+ | |||
+ | |||
+ | {{Softmax}} |