Commit 27e37a0

DOC make loss function in SGD consistent with subgradient. Comment by Martin Jaggi :)

1 parent: d6eb004

File tree: 1 file changed (+1, -1)


doc/modules/sgd.rst

Lines changed: 1 addition & 1 deletion

@@ -267,7 +267,7 @@ training error given by

     .. math::

-        E(w,b) = \sum_{i=1}^{n} L(y_i, f(x_i)) + \alpha R(w)
+        E(w,b) = \frac{1}{n}\sum_{i=1}^{n} L(y_i, f(x_i)) + \alpha R(w)

 where :math:`L` is a loss function that measures model (mis)fit and
 :math:`R` is a regularization term (aka penalty) that penalizes model
