Checkpointing with torch.autograd.grad

Are there plans to make checkpointing (torch.utils.checkpoint) work with torch.autograd.grad? Is this even possible?

Hi,

Unfortunately, this is a very complex problem with a lot of edge cases. I don't think anyone is working on it right now.
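
For readers landing here later, a minimal sketch of the limitation being asked about, assuming a recent PyTorch where torch.utils.checkpoint.checkpoint accepts a use_reentrant flag (the segment function is just an illustrative stand-in, and the exact error text may differ by version):

```python
import torch
from torch.utils.checkpoint import checkpoint

def segment(x):
    # Stand-in for a checkpointed sub-network.
    return torch.sin(x) * torch.cos(x)

x = torch.randn(4, requires_grad=True)

# Reentrant checkpointing works with .backward(): x.grad is populated
# by recomputing the segment during the backward pass.
y = checkpoint(segment, x, use_reentrant=True).sum()
y.backward()
print(x.grad)

# Calling torch.autograd.grad() through a reentrant-checkpointed graph is
# the unsupported case this thread is about: the recomputation is driven by
# an inner backward() call, which is incompatible with grad()-style queries.
x.grad = None
y = checkpoint(segment, x, use_reentrant=True).sum()
try:
    (gx,) = torch.autograd.grad(y, x)
except RuntimeError as e:
    print("autograd.grad failed:", e)

# The non-reentrant implementation (use_reentrant=False) lifts this
# restriction and returns the gradient directly.
y = checkpoint(segment, x, use_reentrant=False).sum()
(gx,) = torch.autograd.grad(y, x)
print(gx)
```

The non-reentrant path was added to PyTorch after threads like this one, so if your build supports use_reentrant=False, it may already cover this use case.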