Copying model output to a torch.Tensor where requires_grad is True

That is from a time long gone and answers a different question.

  1. No. Between creating a new tensor requiring grad and using .data (which you never should these days), you created a new leaf that will accumulate .grad.
  2. Because you requested it. no_grad signals that you do not need gradients for the operations performed inside it; it does not guarantee anything about the requires_grad of the result.
  3. If the utility function does not work for you, dropping the requires_grad and the .data should do the trick.
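To make points 1–3 concrete, here is a minimal sketch (the model and shapes are made up for illustration) contrasting the .data anti-pattern with the plain detach().clone() copy:

```python
import torch

model = torch.nn.Linear(4, 2)  # hypothetical stand-in model
x = torch.randn(3, 4)

with torch.no_grad():
    out = model(x)
    # Point 2: requires_grad=True is honored even inside no_grad --
    # no_grad only disables recording for *operations*, not factory kwargs.
    leaky = torch.empty(3, 2, requires_grad=True)

# Point 1 (anti-pattern): filling a grad-requiring leaf via .data
# bypasses autograd; the leaf will accumulate .grad in later backward passes.
leaky.data.copy_(out)

# Point 3 (preferred): drop requires_grad and .data. detach().clone()
# gives a plain copy with no shared memory and no autograd history.
copy = out.detach().clone()
```

Here `copy.requires_grad` is False and it carries no history, while `leaky` remains a grad-accumulating leaf.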

Best regards

Thomas
