I just tried torch 2.0, which is for sure amazing. However, for my research I would like to do double backward in my code, which is not currently supported by torch 2.0. Would you have a rough estimate of how soon it will be supported?
I’m not sure what you mean by “double backward.” If you want to compute,
for example, second derivatives (e.g., a hessian), pytorch version 2.0.0
works for me:
>>> torch.__version__
'2.0.0'
>>> t = torch.ones(1, requires_grad=True)
>>> p = t**3
>>> d1 = torch.autograd.grad(p, t, create_graph=True)
>>> d2 = torch.autograd.grad(d1, t)
>>> d1
(tensor([3.], grad_fn=<MulBackward0>),)
>>> d2
(tensor([6.]),)
If you have a different use case in mind, please post a fully self-contained,
runnable script that shows what you are trying to do and how it fails,
together with its output.
The issue is tracked at torch.compile with aotautograd does not support double backwards · Issue #91469 · pytorch/pytorch · GitHub, if you are wondering about torch.compile support.
Hi Frank, thank you for trying. Sorry, I didn't make my question clear. torch.compile in PyTorch 2.0 doesn't support double backward. Without torch.compile, everything works fine.
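To make the failing case concrete, here is a minimal sketch of the kind of reproduction I mean (the toy function and tensor shapes are only illustrative, not my actual model). The first grad call with create_graph=True is fine; the second grad call, i.e. the double backward, is the step that goes through the unsupported path when the function is wrapped in torch.compile:

import torch

# Toy function, purely illustrative -- not my actual research model.
def f(x):
    return (x ** 3).sum()

compiled_f = torch.compile(f)

x = torch.ones(2, requires_grad=True)
y = compiled_f(x)

# First backward, keeping the graph so second derivatives can be taken.
(g,) = torch.autograd.grad(y, x, create_graph=True)

# Second backward ("double backward") -- this is the step reported as
# unsupported under torch.compile in torch 2.0 (see issue #91469).
(h,) = torch.autograd.grad(g.sum(), x)
print(h)

Running the same script with the plain (uncompiled) f instead of compiled_f computes both derivatives without any problem.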
Thank you soulitzer, yes, that was my concern.