Tensor requires_grad

I have read the tutorial and found that when I create a tensor with torch.Tensor, requires_grad defaults to False. However, when I feed such data to the network, the network's parameters can still be optimized. Why? I thought that downstream, the parameters' requires_grad should also be False… I do not understand.

For training, the parameters do require gradients. The input does not, but an operation that takes both a parameter and the input as arguments produces a tensor that requires gradients.
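A minimal sketch of this behaviour (the layer shapes here are just illustrative):

```python
import torch
import torch.nn as nn

# A plain input tensor: requires_grad defaults to False
x = torch.randn(4, 3)
print(x.requires_grad)             # False

# Layer parameters require gradients by default
layer = nn.Linear(3, 2)
print(layer.weight.requires_grad)  # True

# The output mixes a parameter and the input, so it requires gradients,
# which lets the loss backpropagate into the parameters
y = layer(x)
print(y.requires_grad)             # True

y.sum().backward()
print(layer.weight.grad is not None)  # True: the parameter got a gradient
print(x.grad is None)                 # True: the input did not
```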

Best regards

Thomas

Thanks so much for your reply. So requires_grad is not about the loss with respect to the input data, but about the layer's parameters?

For training NNs, yes. For other uses, e.g. adversarial examples or style transfer, you sometimes want gradients w.r.t. the input, too.
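For instance, a tiny sketch of asking for input gradients explicitly (the model here is a stand-in; in an adversarial-example setting you would perturb the input along these gradients):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 1)

# Explicitly request gradients w.r.t. the input
x = torch.randn(2, 3, requires_grad=True)

loss = model(x).sum()
loss.backward()

# x.grad now holds d(loss)/dx, same shape as x
print(x.grad.shape)  # torch.Size([2, 3])
```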

Best regards

Thomas

Thank you so much. I know a bit about GANs: if I set requires_grad of the discriminator's input to False, can the gradients still be backpropagated to the generator?

If you set it to False, no. But you would not have to set it to True yourself, because as the output of the generator it will have it set already.
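A toy sketch with linear layers standing in for the generator and discriminator (real GANs would use deeper networks, but the gradient flow is the same):

```python
import torch
import torch.nn as nn

generator = nn.Linear(8, 3)
discriminator = nn.Linear(3, 1)

noise = torch.randn(4, 8)      # plain input: requires_grad is False
fake = generator(noise)
print(fake.requires_grad)      # True: it is the output of the generator

# The generator loss backpropagates through `fake` into the generator
g_loss = discriminator(fake).mean()
g_loss.backward()
print(generator.weight.grad is not None)  # True

# To train the discriminator on fakes WITHOUT updating the generator,
# detach the generator output from the graph instead
d_score = discriminator(fake.detach())
print(fake.detach().requires_grad)        # False
```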

Best regards

Thomas


Thanks so much for your quick reply, I got it working.