Gradients of output w.r.t. input

Hi,
I’m trying to get the gradients of the output w.r.t. the input. However, requires_grad of the input is usually set to False, since we don’t need to update the input. My question is: is there a way to set requires_grad of the input variable to True, without the input being updated during training, so that I can get the gradients w.r.t. the input later on? Thank you guys!

Yes, you can do that.
Try

input.requires_grad = True

(set this before the forward pass), and after the loss.backward() call you can access the gradients w.r.t. the input via input.grad.
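For reference, here is a minimal sketch of the full pattern (the model, shapes, and variable names are just placeholders for illustration):

import torch
import torch.nn as nn

model = nn.Linear(10, 1)                      # toy model, stands in for your network
x = torch.randn(4, 10, requires_grad=True)    # enable grad tracking on the input

output = model(x)
loss = output.sum()                           # any scalar loss works here
loss.backward()

print(x.grad.shape)                           # gradients of the loss w.r.t. the input

Note that the input is never updated during training as long as you don’t pass it to the optimizer: e.g. torch.optim.SGD(model.parameters(), lr=0.01) only steps over the model’s parameters, so x keeps its original values even though it now has a .grad attribute.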


Thank you! It worked!