Commit 1527b37

jfsantos authored and apaszke committed
Fixed typo and rendering of some equations (pytorch#693)
* Fixed typo and rendering of some equations
* Few more fixes to MSELoss docs
* Cleaning up whitespace to make pep8 happy
1 parent de46596 commit 1527b37

File tree

1 file changed: +10 -8 lines changed


torch/nn/modules/loss.py

Lines changed: 10 additions & 8 deletions
@@ -52,7 +52,7 @@ class L1Loss(_Loss):
 
 
 class NLLLoss(_WeighedLoss):
-    r"""The negative log likelihood loss. It is useful to train a classication problem with n classes
+    r"""The negative log likelihood loss. It is useful to train a classification problem with n classes
 
     If provided, the optional argument `weights` should be a 1D Tensor assigning
     weight to each of the classes.
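The per-class `weights` argument described in this docstring can be illustrated with a small pure-Python sketch (the `nll_loss` helper below is hypothetical, not the torch API, and averaging by batch size is an assumption of this sketch):

```python
import math

def nll_loss(log_probs, targets, weights=None):
    """Hypothetical sketch of a weighted negative log likelihood loss.

    log_probs: per-sample lists of log-probabilities over the classes.
    targets:   the correct class index for each sample.
    weights:   optional per-class weights, as in the docstring above.
    """
    total = 0.0
    for lp, cls in zip(log_probs, targets):
        w = 1.0 if weights is None else weights[cls]
        total += -w * lp[cls]      # pick out the log-prob of the true class
    return total / len(targets)    # plain batch mean (an assumption here)

# One sample over two classes, assigning probability 0.5 to the true class:
loss = nll_loss([[math.log(0.5), math.log(0.5)]], [0])  # → log(2) ≈ 0.693
```

Up-weighting a class scales its contribution proportionally, which is the usual way to counter class imbalance.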
@@ -148,7 +148,8 @@ class KLDivLoss(_WeighedLoss):
     `input` `Tensor`.
 
     The loss can be described as:
-    :math:`loss(x, target) = 1/n \sum(target_i * (log(target_i) - x_i))`
+
+    .. math:: loss(x, target) = 1/n \sum(target_i * (log(target_i) - x_i))
 
     By default, the losses are averaged for each minibatch over observations
     **as well as** over dimensions. However, if the field
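The corrected formula can be sanity-checked with a short pure-Python sketch (the `kl_div` helper is hypothetical; `x` is assumed to hold log-probabilities, matching the `x_i` in the formula):

```python
import math

def kl_div(x, target):
    # loss(x, target) = 1/n * sum(target_i * (log(target_i) - x_i)),
    # where x holds log-probabilities and target holds probabilities.
    n = len(x)
    return sum(t * (math.log(t) - xi) for xi, t in zip(x, target)) / n

# When x is exactly log(target), every term vanishes and the loss is 0:
x = [math.log(0.5), math.log(0.5)]
loss = kl_div(x, [0.5, 0.5])  # → 0.0
```

Any mismatch between the two distributions makes the loss strictly positive, as expected for a KL divergence.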
@@ -164,13 +165,14 @@ class MSELoss(_Loss):
     r"""Creates a criterion that measures the mean squared error between
     `n` elements in the input `x` and target `y`:
 
-    :math:`loss(x, y) = 1/n \sum |x_i - y_i|^2`
+    .. math:: loss(x, y) = 1/n \sum |x_i - y_i|^2
 
-    `x` and `y` arbitrary shapes with a total of `n` elements each
-    the sum operation still operates over all the elements, and divides by `n`.
+    `x` and `y` can have arbitrary shapes with a total of `n` elements each.
+
+    The sum operation still operates over all the elements, and divides by `n`.
 
     The division by `n` can be avoided if one sets the internal variable
-    `sizeAverage` to `False`
+    `sizeAverage` to `False`.
 
     """
     pass
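Both averaging behaviours described above (the default mean, and the `sizeAverage = False` plain sum) can be sketched in plain Python (the `mse` helper and its `size_average` flag are illustrative names, not the torch API):

```python
def mse(x, y, size_average=True):
    # loss(x, y) = 1/n * sum |x_i - y_i|^2; with size_average=False the
    # division by n is skipped, mirroring sizeAverage=False above.
    total = sum((xi - yi) ** 2 for xi, yi in zip(x, y))
    return total / len(x) if size_average else total

print(mse([1.0, 2.0], [1.0, 4.0]))                      # → 2.0
print(mse([1.0, 2.0], [1.0, 4.0], size_average=False))  # → 4.0
```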
@@ -180,11 +182,11 @@ class BCELoss(_WeighedLoss):
     r"""Creates a criterion that measures the Binary Cross Entropy
     between the target and the output:
 
-    :math:`loss(o, t) = - 1/n \sum_i (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))`
+    .. math:: loss(o, t) = - 1/n \sum_i (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))
 
     or in the case of the weights argument being specified:
 
-    :math:`loss(o, t) = - 1/n \sum_i weights[i] * (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))`
+    .. math:: loss(o, t) = - 1/n \sum_i weights[i] * (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))
 
     This is used for measuring the error of a reconstruction in for example
     an auto-encoder. Note that the targets `t[i]` should be numbers between 0 and 1,
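Both BCE variants above (unweighted, and with the `weights` argument) reduce to one pure-Python sketch (the `bce` helper is hypothetical; the outputs `o[i]` must lie strictly between 0 and 1 so the logs are defined):

```python
import math

def bce(o, t, weights=None):
    # loss(o, t) = -1/n * sum_i w_i * (t[i]*log(o[i]) + (1-t[i])*log(1-o[i]))
    total = 0.0
    for i in range(len(o)):
        term = t[i] * math.log(o[i]) + (1 - t[i]) * math.log(1 - o[i])
        total += term if weights is None else weights[i] * term
    return -total / len(o)

# A maximally uncertain output o = 0.5 costs log(2) regardless of the target:
loss = bce([0.5], [1.0])  # → log(2) ≈ 0.693
```

A per-element weight simply scales the corresponding term, which matches the second formula in the diff.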
