Hey, I was working on Guided Backpropagation using hooks too. I was able to call backward() and return a new grad_in, but the updated gradients don't seem to be used in the subsequent computation: the gradient of the prediction w.r.t. the input is the same whether or not the backward hook is registered.
Does a backward hook only record the changes to a module's gradients, rather than feeding them into the next module's computation?
Were you able to complete this code?
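For reference, here is a minimal sketch of the kind of setup I mean (the model and names are illustrative, not from any particular implementation). Note that the older `register_backward_hook` is known not to propagate a returned `grad_input` reliably, which may explain the behavior above; the newer `register_full_backward_hook` is documented to use the returned gradient in subsequent computations:

```python
import torch
import torch.nn as nn

# Hypothetical toy model just for illustration.
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))

def guided_relu_hook(module, grad_input, grad_output):
    # Guided Backpropagation: zero out negative gradients flowing
    # backward through each ReLU. The returned tuple replaces
    # grad_input in the rest of the backward pass.
    return (torch.clamp(grad_input[0], min=0.0),)

# Register the hook on every ReLU module.
handles = [
    m.register_full_backward_hook(guided_relu_hook)
    for m in model.modules()
    if isinstance(m, nn.ReLU)
]

x = torch.randn(1, 4, requires_grad=True)
model(x).sum().backward()
guided_grad = x.grad.clone()  # gradient of the prediction w.r.t. the input

# Clean up the hooks when done.
for h in handles:
    h.remove()
```

With `register_full_backward_hook`, `x.grad` should differ from the plain (un-hooked) gradient whenever a ReLU would otherwise pass a negative gradient through.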