Commit 2b1cd91

alexis-jacq authored and soumith committed
Update extending.rst (pytorch#933)
1 parent 8e46a15 commit 2b1cd91

File tree

1 file changed (+13 additions, 0 deletions)


docs/source/notes/extending.rst

Lines changed: 13 additions & 0 deletions
@@ -86,6 +86,19 @@ small helper functions::
         # return it.
         return Linear()(input, weight, bias)
 
+You probably want to check if the backward method you implemented actually
+computes the derivatives of your function. You can do so by comparing against
+numerical approximations obtained with small finite differences::
+
+    from torch.autograd import gradcheck
+
+    # gradcheck takes a tuple of tensors as input, checks if the gradient
+    # evaluated with these tensors is close enough to the numerical
+    # approximations, and returns True if they all verify this condition.
+    input = (Variable(torch.randn(20, 20).double(), requires_grad=True),)
+    test = gradcheck(Linear(), input, eps=1e-6, atol=1e-4)
+    print(test)
+
 Extending :mod:`torch.nn`
 -------------------------
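As a sanity check that the pattern in this diff still works, here is a minimal, self-contained sketch using the current tensor API (`Variable` was later merged into `Tensor`, so plain double-precision tensors with `requires_grad=True` are used instead). The matrix multiply here is a hypothetical stand-in for the docs' `Linear` function, not the actual class from the tutorial.

```python
import torch
from torch.autograd import gradcheck

# gradcheck compares analytical gradients against finite-difference
# estimates, so double precision is needed for the numerics to be stable.
weight = torch.randn(10, 20, dtype=torch.double, requires_grad=True)
inputs = (torch.randn(4, 10, dtype=torch.double, requires_grad=True),)

# Returns True if analytical and numerical gradients match within atol;
# raises an error with a diagnostic message otherwise.
test = gradcheck(lambda x: x.mm(weight), inputs, eps=1e-6, atol=1e-4)
print(test)
```

Note that `gradcheck` is itself the function imported from `torch.autograd`, so it is called directly rather than as `gradcheck.gradcheck`.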

0 commit comments
