You can set the `create_graph=True` flag while running the backward pass so that you can then backward through that backward pass.
But you won’t get a `.grad.grad` attribute: there is no such field, and calling `.backward()` a second time simply accumulates the result into the existing `.grad`.
In general, when computing higher-order derivatives, I would suggest that you use `autograd.grad()`, which returns the gradients directly instead of accumulating them into `.grad`. That avoids any confusion about gradient accumulation across multiple backward passes.
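As a minimal sketch of this pattern (the function `y = x**3` is just an arbitrary example), the first `autograd.grad()` call uses `create_graph=True` so the gradient itself is differentiable, and the second call differentiates that gradient:

```python
import torch

x = torch.tensor(2.0, requires_grad=True)
y = x ** 3  # dy/dx = 3x^2, d2y/dx2 = 6x

# First derivative: create_graph=True records the backward pass
# in the graph so grad_x can itself be differentiated.
(grad_x,) = torch.autograd.grad(y, x, create_graph=True)

# Second derivative: differentiate grad_x with respect to x.
(grad2_x,) = torch.autograd.grad(grad_x, x)

print(grad_x.item())   # 3 * 2**2 = 12.0
print(grad2_x.item())  # 6 * 2 = 12.0
```

Note that neither call touches `x.grad`, so there is no accumulation to reason about.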