Copy from Variable to torch.FloatTensor isn't implemented


#1

I have the following code, and the last line is giving me that error. Is there a way I can do an indexed copy like in NumPy?

import torch

# features is assumed to be a Variable of shape (N, C, H, W)
N, C, H, W = features.size()
gram = torch.zeros((N, C, C))  # plain FloatTensor, not a Variable
print(gram.size())
for i in range(N):
    f = features[i, :, :, :]
    print(f.size())
    f = f.view(C, -1)
    print(f.size())
    g = torch.mm(f, f.t())  # g is a Variable, because features is
    print(g.size())
    gram[i, :, :] = g  # <- this line raises the error

(Moskomule) #2

Hi, which version of PyTorch are you using? I’m on 0.2.0, and that code runs without error for me.


(Thomas V) #3

Use g.data, but note that you won’t be able to .backward() through this copy.
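
A minimal sketch of that workaround, assuming the pre-0.4 Variable API; the random features here is just a hypothetical stand-in for the original input:

import torch
from torch.autograd import Variable

features = Variable(torch.randn(2, 3, 4, 4))  # hypothetical stand-in input
N, C, H, W = features.size()
gram = torch.zeros(N, C, C)  # plain FloatTensor
for i in range(N):
    f = features[i].view(C, -1)
    g = torch.mm(f, f.t())   # g is a Variable
    gram[i, :, :] = g.data   # .data exposes the underlying Tensor, so the copy
                             # succeeds, but autograd does not record it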


#4

How do I check my PyTorch version? I downloaded the most recent one from their website.


#5

Why can’t I use .backward()?


(Moskomule) #6

You can check with pip list or conda list, for example.

And the reason you cannot backpropagate is that you are using a plain Tensor. You need to use a Variable instead to back-propagate.
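
A tiny illustration of the distinction, assuming the pre-0.4 API (the values are arbitrary):

import torch
from torch.autograd import Variable

t = torch.ones(2, 2)                 # plain Tensor: no autograd history
v = Variable(t, requires_grad=True)  # Variable wrapping that Tensor
loss = (v * 3).sum()
loss.backward()                      # works: v is a Variable
print(v.grad)                        # gradient of loss w.r.t. v (all 3s)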


(Vishwak Srinivasan) #7

Since PyTorch 0.1.12 (if I am not mistaken), there has been a torch.__version__ attribute that tells you which version of PyTorch you are running.

Evaluate torch.__version__ in a Python interpreter after importing PyTorch to find out.
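
For example:

import torch
print(torch.__version__)  # e.g. '0.2.0'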


(Hugh Perkins) #8

@mckowski is there any reason you don’t want to make gram a Variable? I think that would fix the issue.

(Edit: by the way, if you create it with requires_grad=False, autograd won’t compute gradients for gram itself, in case that was a concern.)
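
A hedged sketch of that suggestion, again on the pre-0.4 API; the random features and the .sum() loss are placeholders, not the original setup:

import torch
from torch.autograd import Variable

features = Variable(torch.randn(2, 3, 4, 4), requires_grad=True)  # hypothetical input
N, C, H, W = features.size()
gram = Variable(torch.zeros(N, C, C))   # Variable target (requires_grad=False by default)
for i in range(N):
    f = features[i].view(C, -1)
    gram[i, :, :] = torch.mm(f, f.t())  # Variable-to-Variable copy, recorded by autograd
gram.sum().backward()                   # dummy loss; gradients flow back to features
print(features.grad.size())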