Manually changing the gradients of parameters

Is it possible to use an autograd function to set a parameter's gradient manually while passing a different set of gradients back during the backward pass?
(Essentially: send gradients x upstream, but set the current parameter's gradient to y.)

You can run the whole "correct" backpropagation with .backward() and then overwrite the gradient of the parameter you want, like this:

my_parameter.grad = target_values  # target_values: a tensor with the same shape as my_parameter

This is not recommended, but it should solve your problem (if I understood it correctly). If there is a cleaner solution, I would be glad to know.
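A minimal sketch of this approach (the names `w` and `target_values` are illustrative): run the normal backward pass first, then replace the computed gradient before the optimizer step.

```python
import torch

# Run the usual ("correct") backward pass first.
w = torch.nn.Parameter(torch.ones(3))
loss = (w * 2.0).sum()
loss.backward()  # autograd fills w.grad with the true gradient, [2., 2., 2.]

# Now overwrite that gradient with the values you want the optimizer to see.
target_values = torch.tensor([5.0, 5.0, 5.0])
w.grad = target_values
```

Any optimizer stepping on `w` afterwards will use the overwritten values, not the ones autograd computed.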

It doesn’t work for my case. This approach only changes the gradient stored on that one parameter, after the backward pass is already finished. In other words, even though you overwrite the parameter's gradient, the modified value is never propagated backward, so upstream parameters still receive gradients computed from the original values.
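What the question actually asks for is the autograd-function route: inside backward(), return a hand-picked gradient for the parameter while sending the true chain-rule gradient upstream. A sketch, with illustrative names (`GradSwap`, the constant 7.0 as the manual gradient):

```python
import torch

class GradSwap(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, param):
        ctx.save_for_backward(param)
        return x * param

    @staticmethod
    def backward(ctx, grad_output):
        (param,) = ctx.saved_tensors
        # True chain-rule gradient, passed upstream to x:
        grad_x = grad_output * param
        # Hand-picked gradient stored on the parameter instead of the true one:
        grad_param = torch.full_like(param, 7.0)
        return grad_x, grad_param

param = torch.nn.Parameter(torch.tensor([3.0]))
x = torch.tensor([2.0], requires_grad=True)
out = GradSwap.apply(x, param)
out.backward()
# x.grad is [3.] (the true gradient), param.grad is [7.] (the manual one)
```

Because the swap happens during the backward pass itself, everything upstream of `x` still receives consistent gradients, while the parameter gets the manual value.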