Commit 07ebbcb

apaszke authored and soumith committed
Add Parameter docs
1 parent ca555ab commit 07ebbcb

2 files changed: +25 −0 lines changed

docs/source/nn.rst

Lines changed: 6 additions & 0 deletions

@@ -7,6 +7,12 @@ torch.nn
 .. automodule:: torch.nn
 .. currentmodule:: torch.nn
 
+Parameters
+----------
+
+.. autoclass:: Parameter
+    :members:
+
 Containers
 ----------------------------------
 
torch/nn/parameter.py

Lines changed: 19 additions & 0 deletions

@@ -2,6 +2,25 @@
 
 
 class Parameter(Variable):
+    """A kind of Variable that is to be considered a module parameter.
+
+    Parameters are :class:`~torch.autograd.Variable` subclasses that have a
+    very special property when used with :class:`Module` s: when they're
+    assigned as Module attributes, they are automatically added to the list
+    of its parameters, and will appear e.g. in the :meth:`~Module.parameters`
+    iterator. Assigning a Variable doesn't have such an effect. This is
+    because one might want to cache some temporary state, like the last
+    hidden state of an RNN, in the model. If there were no such class as
+    :class:`Parameter`, these temporaries would get registered too.
+
+    Another difference is that parameters can't be volatile and that they
+    require gradient by default.
+
+    Arguments:
+        data (Tensor): parameter tensor.
+        requires_grad (bool, optional): if the parameter requires gradient. See
+            :ref:`excluding-subgraphs` for more details.
+    """
 
     def __init__(self, data, requires_grad=True):
         super(Parameter, self).__init__(data, requires_grad=requires_grad)
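The auto-registration behavior the docstring describes can be illustrated with a toy sketch. This is plain Python, not the real torch implementation; the class names mirror `torch.nn.Parameter` and `torch.nn.Module` only for clarity, and the registration is simplified to a single `__setattr__` hook:

```python
class Parameter:
    """Toy stand-in for torch.nn.Parameter: marks a value as a parameter."""
    def __init__(self, data, requires_grad=True):
        self.data = data
        self.requires_grad = requires_grad


class Module:
    """Toy stand-in for torch.nn.Module showing attribute auto-registration."""
    def __init__(self):
        # Bypass our own __setattr__ while setting up internal state.
        object.__setattr__(self, '_parameters', {})

    def __setattr__(self, name, value):
        # Parameters assigned as attributes are registered automatically;
        # plain values (e.g. cached temporary state) are not.
        if isinstance(value, Parameter):
            self._parameters[name] = value
        object.__setattr__(self, name, value)

    def parameters(self):
        return list(self._parameters.values())


m = Module()
m.weight = Parameter([1.0, 2.0])   # registered as a parameter
m.last_hidden = [0.0, 0.0]         # cached temporary state, not registered
```

Here `m.parameters()` yields only `weight`, which mirrors the docstring's point: assigning a plain (non-Parameter) value does not register it, so temporaries like an RNN's last hidden state stay out of the parameter list.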
