@@ -7,6 +7,12 @@ torch.nn
 .. automodule:: torch.nn
 .. currentmodule:: torch.nn
 
+Parameters
+----------
+
+.. autoclass:: Parameter
+    :members:
+
 Containers
 ----------------------------------
 
 
 
 class Parameter(Variable):
+    """A kind of Variable that is to be considered a module parameter.
+
+    Parameters are :class:`~torch.autograd.Variable` subclasses that have a
+    very special property when used with :class:`Module`\ s - when they're
+    assigned as Module attributes, they are automatically added to the list
+    of the module's parameters, and will appear e.g. in the
+    :meth:`~Module.parameters` iterator. Assigning a plain Variable doesn't
+    have such an effect. This is because one might want to cache some
+    temporary state, like the last hidden state of an RNN, in the model. If
+    there was no such class as :class:`Parameter`, these temporaries would
+    get registered too.
+
+    Another difference is that parameters can't be volatile and that they
+    require gradient by default.
+
+    Arguments:
+        data (Tensor): parameter tensor.
+        requires_grad (bool, optional): whether the parameter requires
+            gradient. See :ref:`excluding-subgraphs` for more details.
+    """
 
     def __init__(self, data, requires_grad=True):
         super(Parameter, self).__init__(data, requires_grad=requires_grad)
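
The docstring describes auto-registration: assigning a `Parameter` as a `Module` attribute adds it to the module's parameter list, while assigning a plain Variable (or any other value) does not. Below is a minimal, torch-free sketch of that mechanism; the `Parameter` and `Module` classes here are simplified stand-ins, not PyTorch's actual implementation (which hooks `__setattr__` in `torch.nn.Module` with additional bookkeeping):

```python
class Parameter:
    """Stand-in for torch.nn.Parameter: a value modules auto-register."""
    def __init__(self, data, requires_grad=True):
        self.data = data
        self.requires_grad = requires_grad  # True by default, as in the docstring


class Module:
    """Sketch of the attribute hook that collects Parameters."""
    def __init__(self):
        # Bypass our own __setattr__ while setting up internal state.
        object.__setattr__(self, "_parameters", {})

    def __setattr__(self, name, value):
        # Only Parameter instances are recorded; plain values (e.g. cached
        # temporary state like an RNN's last hidden state) are not.
        if isinstance(value, Parameter):
            self._parameters[name] = value
        object.__setattr__(self, name, value)

    def parameters(self):
        return iter(self._parameters.values())


m = Module()
m.weight = Parameter([1.0, 2.0])  # registered as a parameter
m.hidden = [0.0, 0.0]             # plain attribute, not registered
print(len(list(m.parameters())))  # prints 1
```

This illustrates why a dedicated `Parameter` class is needed: without the `isinstance` check, every attribute assignment would land in the parameter list, including temporaries the optimizer should never see.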