I am calling set_requires_grad() before training:
x_tensor.set_requires_grad(true);
The function set_requires_grad() returns a Tensor.
Is it the same x_tensor? Should I capture and use the returned value, or does set_requires_grad() change x_tensor in place?
You would need to assign the returned tensor and use that going forward. Alternatively, you can use the in-place operation via .requires_grad_() (note the trailing underscore).
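As a minimal sketch of both approaches, assuming x_tensor is a torch::Tensor (the randn initialization is only a placeholder for however you actually create it):

#include <torch/torch.h>
#include <iostream>

int main() {
    // Placeholder tensor; gradients are not tracked by default.
    torch::Tensor x_tensor = torch::randn({3, 3});

    // Option 1: capture the tensor returned by set_requires_grad()
    // and use that going forward.
    x_tensor = x_tensor.set_requires_grad(true);

    // Option 2: the in-place variant (trailing underscore),
    // which modifies x_tensor directly.
    // x_tensor.requires_grad_(true);

    std::cout << x_tensor.requires_grad() << std::endl;  // prints 1
    return 0;
}

Either way, x_tensor.requires_grad() should report true before you start training.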