RuntimeError: Function FillBackward1 returned an invalid gradient at index 1 - expected type torch.cuda.FloatTensor but got torch.FloatTensor

Hello,

I am new to PyTorch, and I am doing some tests with autograd. I keep getting the following error with the code below. Could someone explain why fill_() is not working with the GPU?

Code:

import torch
xx = torch.Tensor([1]).cuda()
xx.requires_grad = True
b = torch.Tensor(2).fill_(xx[0]).cuda()
c = sum(b)
c.backward()
xx.grad

Error:


RuntimeError Traceback (most recent call last)
in ()
4 b = torch.Tensor(2).fill_(xx[0]).cuda()
5 c = sum(b)
----> 6 c.backward()
7 xx.grad

1 frames
/usr/local/lib/python3.6/dist-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables)
91 Variable._execution_engine.run_backward(
92 tensors, grad_tensors, retain_graph, create_graph,
---> 93 allow_unreachable=True) # allow_unreachable flag
94
95

RuntimeError: Function FillBackward1 returned an invalid gradient at index 1 - expected type torch.cuda.FloatTensor but got torch.FloatTensor

Actually, I figured it out. I needed to change

b = torch.Tensor(2).fill_(xx[0]).cuda()
into

b = torch.Tensor(2).cuda().fill_(xx[0])
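
The reason is that fill_() was recorded on a CPU tensor, so its FillBackward1 node produced a CPU gradient for xx[0], which lives on the GPU. Allocating b on the GPU before calling fill_() keeps the whole autograd graph on one device. For completeness, a minimal working sketch (assuming a CUDA device is available):

import torch

xx = torch.Tensor([1]).cuda()
xx.requires_grad = True

# allocate on the GPU first, then fill with the CUDA scalar,
# so every op in the graph lives on the same device
b = torch.Tensor(2).cuda().fill_(xx[0])

c = sum(b)
c.backward()
print(xx.grad)  # c = 2 * xx[0], so this should print tensor([2.], device='cuda:0')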