tom
(Thomas V)
May 5, 2017, 9:00pm
#7
Hello @apaszke,

I'm not sure whether it is relevant, but for me `fn.backward` does not take `create_graph` either; `backward(fn, create_graph=True)` works as expected.
This seems to be because right now

```python
                be constructed, allowing to compute higher order derivative
                products. Defaults to ``False``.
        """
        torch.autograd.backward(self, gradient, retain_graph, create_graph)

    def register_hook(self, hook):
        """Registers a backward hook.

        The hook will be called every time a gradient with respect to the
        variable is computed. The hook should have the following signature::

            hook(grad) -> Variable or None

        The hook should not modify its argument, but it can optionally return
        a new gradient which will be used in place of :attr:`grad`.

        This function returns a handle with a method ``handle.remove()``
        that removes the hook from the module.

        Example:
            >>> v = Variable(torch.Tensor([0, 0, 0]), requires_grad=True)
```

does not take `create_graph`.
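For completeness, a minimal sketch of the workaround I mean (the names `x` and `fn` are just illustrative, not taken from the code above):

```python
import torch
from torch.autograd import Variable

x = Variable(torch.ones(3), requires_grad=True)
fn = (x * x).sum()  # some scalar result, standing in for "fn" above

# fn.backward(create_graph=True) is rejected here, but the module-level
# function accepts the keyword and records the graph of the derivative:
torch.autograd.backward(fn, create_graph=True)

print(x.grad)  # 2 * x; it carries a grad_fn because the graph was kept

# ...so a second backward through the gradient itself is possible:
x.grad.sum().backward()
print(x.grad)  # 2 * x + 2: the first derivative plus d/dx of sum(2 * x)
```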
Best regards
Thomas