Hello, I want to convert a Variable from FloatTensor to LongTensor in the forward function. Can gradients still flow through this cast in the backward pass? Thank you.
No, this won’t work.
A quick check:

```python
a = torch.ones(10, requires_grad=True)                    # works
a = torch.ones(10, requires_grad=True, dtype=torch.long)  # RuntimeError: only Tensors of floating point dtype can require gradients
```
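To see what happens when you cast inside a forward pass, here is a minimal sketch: the integer cast silently detaches the result from the autograd graph, so nothing flows backward through it.

```python
import torch

a = torch.ones(10, requires_grad=True)
b = a.long()  # cast FloatTensor -> LongTensor

# The cast output is detached: it neither requires grad nor has a grad_fn,
# so backward() cannot propagate gradients through it back to `a`.
print(b.requires_grad)  # False
print(b.grad_fn)        # None
```

Note that casts between floating-point dtypes (e.g. `.half()` to `.float()`) are differentiable; it is specifically the integer cast that breaks the graph.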
In addition, casting to an integer type is a form of quantization, which is not differentiable.
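If you really need an integer-valued operation in the forward pass but still want gradients, one common workaround (not mentioned in this thread) is a straight-through estimator: the forward pass uses the quantized value while the backward pass treats the operation as the identity. A minimal sketch:

```python
import torch

def round_ste(x):
    # Straight-through trick: x.round() is not differentiable, but the
    # detach() keeps the graph connected to x, so gradients flow through
    # as if this function were the identity.
    return x + (x.round() - x).detach()

x = torch.tensor([0.2, 0.7, 1.4], requires_grad=True)
y = round_ste(x)        # forward value equals x.round()
y.sum().backward()      # backward treats round_ste as identity
print(y.detach())       # tensor([0., 1., 1.])
print(x.grad)           # tensor([1., 1., 1.])
```

Whether this gradient approximation is acceptable depends on your model; it is a heuristic, not an exact derivative.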
OK, thanks so much! But I have another question: when requires_grad of a Parameter is False, will the layers before it still backpropagate? I am confused about the meaning of requires_grad. Could you help me understand it?
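For reference, a minimal sketch of this situation (the layer sizes are illustrative): freezing a middle layer's parameters only stops gradients for those parameters; gradients still flow through that layer to the earlier layers, because the layer's input itself requires grad.

```python
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4), nn.Linear(4, 1))
for p in net[1].parameters():
    p.requires_grad = False  # freeze only the middle layer's parameters

out = net(torch.randn(2, 4)).sum()
out.backward()

print(net[0].weight.grad is None)  # False: the earlier layer still gets gradients
print(net[1].weight.grad is None)  # True: the frozen parameters get none
```

So requires_grad controls whether autograd accumulates a gradient for that particular tensor; it does not cut the graph for everything upstream of it.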