How can I recalculate gradients from a manipulated gradient?

This is a question about Autograd and PyTorch's gradient management system.

  1. Basically, my neural network is like the encoder part of a VAE - it takes images as input and returns a vector R.

  2. With R, I compute the loss L = SomeLossFunction(R).

  3. After calling L.backward(), R will have its own gradient - I mean, there should be something like R.grad.

  4. What I want to do: adjust R.grad manually, and then recompute the gradients of all the weight parameters in the NN according to this adjusted gradient.

Is this computation possible without modifying the autograd system too much?
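
To make the intent concrete, here is a rough sketch of what I mean (the encoder and loss below are just placeholder stand-ins for my real ones):

```python
import torch
import torch.nn as nn

# Placeholder stand-ins for my real VAE-style encoder and loss.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 16))

def some_loss_function(r):
    return (r ** 2).sum()

images = torch.randn(8, 1, 28, 28)

R = encoder(images)        # step 1: R is a non-leaf tensor
R.retain_grad()            # non-leaf tensors need this so R.grad gets populated
L = some_loss_function(R)  # step 2
L.backward()               # step 3: R.grad and all parameter .grad now exist

adjusted = R.grad * 0.5    # step 4: some manual adjustment - but how do I
                           # recompute the parameter gradients from it?
```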

I'm not really sure what you want to do, but you can avoid modifying autograd itself, since you can directly access gradient information via sometensor.grad (or maybe there is a more proper way to do this) for leaf tensors with requires_grad == True. In my case, when I hit a wall with an autograd problem, this YouTube video really helped me a lot (13 min).
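
For example, one pattern that stays entirely inside the public API would be to take the gradient w.r.t. R first, adjust it, and then backprop the adjusted gradient into the parameters. The encoder and loss here are just dummies, and clamping stands in for whatever manipulation you have in mind:

```python
import torch
import torch.nn as nn

# Dummy encoder / input / loss just for illustration.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 16))
images = torch.randn(8, 1, 28, 28)

R = encoder(images)
L = (R ** 2).sum()

# Gradient of the loss w.r.t. R only; parameter grads are not touched yet.
# retain_graph=True keeps the graph alive for the second backward pass.
grad_R, = torch.autograd.grad(L, R, retain_graph=True)

# Manipulate the gradient however you need (clamping is just an example).
adjusted_grad_R = grad_R.clamp(-1.0, 1.0)

# Backprop the adjusted gradient through the encoder; this populates
# p.grad for every leaf parameter with requires_grad == True.
R.backward(gradient=adjusted_grad_R)

for p in encoder.parameters():
    print(p.shape, p.grad.shape)
```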

btw, is it okay to link a YouTube video here? It's an educational one about PyTorch. :confused:

I'm really sorry - I figured out what I really wanted, thanks to your video and the tutorial: just modifying the backward function, as in the ReLU example, solves my problem. Thanks for your help.
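
In case anyone else lands here, roughly what this looks like is a small custom autograd Function in the style of the ReLU example from the extending-autograd tutorial: identity in forward, and a modified gradient in backward (the clamp is only a placeholder for the actual adjustment):

```python
import torch

class AdjustGrad(torch.autograd.Function):
    """Identity in forward; manipulates the incoming gradient in backward."""

    @staticmethod
    def forward(ctx, input):
        return input

    @staticmethod
    def backward(ctx, grad_output):
        # Placeholder adjustment - replace with whatever you need.
        return grad_output.clamp(-1.0, 1.0)

# Usage: wrap R before computing the loss; the adjusted gradient then
# flows into the encoder's parameter gradients automatically.
# R = AdjustGrad.apply(encoder(images))
# L = some_loss_function(R)
# L.backward()
```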


Glad to hear that you found your own solution!