loss.backward() goes wrong

When I call loss.backward(), I get the error "RuntimeError: Function SubBackward0 returned an invalid gradient at index 0 - expected type torch.cuda.FloatTensor but got torch.cuda.DoubleTensor".
I'm sure that the type of the variable loss is torch.cuda.FloatTensor, because

isinstance(loss, torch.cuda.FloatTensor)

shows True.
Can you help me figure out what's wrong with loss.backward()?

The problem is not with the loss Variable, but probably with some other Variable in your model that should be a FloatTensor but is a DoubleTensor.

Without more context, I can’t say much.
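One way to hunt for the stray DoubleTensor is to print the dtype of every parameter and input. A minimal sketch, assuming a simple model (the nn.Linear here is hypothetical, standing in for your own module); note that a float64 input often sneaks in via torch.from_numpy, since NumPy arrays default to float64:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for your model; parameters are float32 by default.
model = nn.Linear(4, 1)

# A float64 ("Double") input, e.g. as produced by torch.from_numpy.
x = torch.ones(2, 4, dtype=torch.float64)

# List every parameter's dtype to spot a mismatch with the input:
for name, p in model.named_parameters():
    print(name, p.dtype)                 # torch.float32 for both

# Casting the input (or alternatively calling model.double()) aligns
# the types, so backward produces matching float32 gradients:
loss = model(x.float()).sum()
loss.backward()
print(model.weight.grad.dtype)           # torch.float32
```

If one of the printed dtypes is torch.float64, cast that tensor with .float() (or cast the whole model with model.double(), whichever direction you want) so that every tensor in the graph uses the same precision.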