Here is my idea for an optimizer: on each step you compute only some of the derivatives, while the others remain unchanged from previous steps, so it works as a kind of momentum. But this would only be useful if you could compute a subset of the derivatives faster than computing all of them. Is there a way to do that?
You can specify which inputs you would like to compute derivatives for via the inputs= argument to .backward() or autograd.grad().
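A minimal sketch of what that looks like (the parameter names here are made up for illustration). Passing inputs= to .backward() accumulates gradients only into the listed leaves, so the other parameters' .grad fields are left untouched between steps:

```python
import torch

# Two leaf parameters; this step we only want the gradient for w1.
w1 = torch.randn(3, requires_grad=True)
w2 = torch.randn(3, requires_grad=True)

loss = (w1 * w2).sum()

# Only w1.grad is populated; w2.grad is not computed this step,
# so any previously stored value there would remain as-is.
loss.backward(inputs=[w1])

print(w1.grad)  # d loss / d w1 == w2
print(w2.grad)  # None here, since w2 had no prior gradient
```

Note that autograd still has to traverse the parts of the graph needed to reach the listed inputs, so how much time this saves depends on the graph structure; branches that only feed the omitted parameters can be skipped.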