Get gradient w.r.t. the input

I have an image and I want to backpropagate the gradient all the way back to the input. I thought that simply making the image a Variable with requires_grad=True and then calling Loss.backward() would be enough, but it is not working: the values of the input image are exactly the same after calling backward() on the loss, image.grad returns None, and image.requires_grad prints True.

So how do you backprop the gradient through the network all the way back to the input?
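For reference, here is a stripped-down version of what I am trying. This is a sketch only: the toy model and random data are placeholders for my real network, and it is written against modern PyTorch (0.4+), where Variable is merged into Tensor:

```python
import torch
import torch.nn as nn

# Toy stand-in for the real network (placeholder, not my actual model).
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3),   # 1x3x8x8 -> 1x8x6x6
    nn.Flatten(),
    nn.Linear(8 * 6 * 6, 10),
)

image = torch.randn(1, 3, 8, 8, requires_grad=True)  # leaf tensor
target = torch.tensor([3])

loss = nn.functional.cross_entropy(model(image), target)
loss.backward()

# backward() fills in image.grad; it does NOT modify image itself.
print(image.grad.shape)  # expected: torch.Size([1, 3, 8, 8])
```

(Note that backward() only populates .grad; it never changes the image values in place, so unchanged pixels are expected even when everything works. The grad being None is the real symptom.)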

Figured it out; the answer is in this thread: Strange behavior of Variable.cuda() and Variable.grad. In short, calling .cuda() on a Variable that already has requires_grad=True returns a new, non-leaf Variable, and autograd only accumulates .grad on leaf Variables, which is why image.grad stays None even though image.requires_grad is True.
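A minimal sketch of the pitfall and two fixes, assuming a CUDA device is available (again written in the modern Tensor API rather than the old Variable one):

```python
import torch

# Pitfall: .cuda() returns a NEW tensor, so the GPU copy is not a leaf
# and loss.backward() will not fill in its .grad.
image = torch.randn(1, 3, 8, 8, requires_grad=True).cuda()
print(image.requires_grad, image.is_leaf)  # True False -> .grad stays None

# Fix 1: create the leaf tensor directly on the GPU.
image = torch.randn(1, 3, 8, 8, device="cuda", requires_grad=True)
print(image.is_leaf)  # True -> backward() populates image.grad

# Fix 2: keep the non-leaf copy but explicitly ask autograd to retain
# its gradient.
image = torch.randn(1, 3, 8, 8, requires_grad=True).cuda()
image.retain_grad()
```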