Can I set torch.randint's requires_grad=True?

In the PyTorch 1.12 documentation, torch.randint has the parameter requires_grad=False by default, but as far as I know only floating-point tensors can set requires_grad=True. Can anybody explain this to me, or show me how to run autograd on the integer tensor generated by torch.randint?

Is this relevant to something that you are looking for?

import torch
x = torch.randint(5, (3,), requires_grad=True)

Output:

tensor([2, 1, 0], requires_grad=True)

Edit: As a disclaimer, I don’t think this is intended behaviour. It looks like a bug to me.
See -

y = (x + 5.0).sum()
y.backward()

gives an error -

RuntimeError: isDifferentiableType(grad.scalar_type()) INTERNAL ASSERT FAILED at "../torch/csrc/autograd/engine.cpp":746, please report a bug to PyTorch. 

Manipulating the .data attribute can yield unwanted side effects and is deprecated.
The argument might just be there for consistency with other calls (that would be my guess at least), but you are right that integer tensors cannot require gradients:

x = torch.randint(3, 5, (3,), requires_grad=True)
# RuntimeError: Only Tensors of floating point and complex dtype can require gradients
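If you need gradients to flow through values that start out as random integers, the usual workaround is to cast the sampled tensor to a floating-point dtype first and enable autograd on the result. A minimal sketch:

```python
import torch

# Sample integers, then cast to float: only floating point and
# complex tensors can require gradients.
x = torch.randint(0, 5, (3,)).float()
x.requires_grad_(True)

# Now autograd works as usual.
y = (x + 5.0).sum()
y.backward()
print(x.grad)  # d(sum)/dx_i = 1.0 for each element
```

Note that the integer sampling itself is not differentiable (there is no gradient through torch.randint); this only lets gradients flow through operations applied after the cast.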