First of all, I'm sorry for my poor English.
I'm interested in computing a Hessian-vector product for my optimization problem. As far as I know, it's possible in PyTorch using `torch.autograd.grad` with `create_graph=True`.
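A minimal sketch of the double-backward recipe I mean (the function `f` and the shapes here are just illustrative):

```python
import torch

def f(x):
    # Toy scalar objective; its Hessian is diag(6 * x), so the result
    # below is easy to check by hand.
    return (x ** 3).sum()

x = torch.randn(5, requires_grad=True)
v = torch.randn(5)

# First gradient with create_graph=True so grad_x stays differentiable.
(grad_x,) = torch.autograd.grad(f(x), x, create_graph=True)

# Hessian-vector product: differentiate the scalar <grad_x, v> w.r.t. x.
(hvp,) = torch.autograd.grad(grad_x @ v, x)

# Check against the analytic Hessian diag(6 * x):
assert torch.allclose(hvp, 6 * x * v)
```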
My question is: does `torch.autograd.grad(create_graph=True)` work for user-defined functions? (By user-defined I mean functions with custom forward/backward computations, for example https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html.) If it is possible, can anyone explain how PyTorch does it? Otherwise, is there a way to create a user-defined function that is compatible with `create_graph=True`?
Update:
I created a very simple minimal example of calling `torch.autograd.grad` with a custom function: https://gist.github.com/Daiver/1d702b5a9faf0a22e3007500c243da93 (by the way, I'm not sure my example is correct).
It looks like it works: the result seems correct, but forward/backward were each called only once, so PyTorch probably somehow reuses the graph built during the forward call.
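For reference, here is a minimal sketch along the same lines (not the linked gist): a custom `torch.autograd.Function` whose `backward` is written with ordinary differentiable torch ops. If I understand correctly, when `create_graph=True` the ops inside `backward` are themselves recorded, which would explain why forward/backward each run only once:

```python
import torch

class Cube(torch.autograd.Function):
    """y = x**3 with a hand-written backward."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 3

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Plain torch ops: with create_graph=True these are recorded
        # too, which is what makes double backward possible.
        return grad_output * 3 * x ** 2

x = torch.tensor([2.0], requires_grad=True)
y = Cube.apply(x).sum()

(g,) = torch.autograd.grad(y, x, create_graph=True)  # 3 * x**2 = 12
(h,) = torch.autograd.grad(g.sum(), x)               # 6 * x     = 12
```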