Copy from Variable to torch.FloatTensor isn't implemented

I have the following code, and the last line raises that error. Is there a way I can do an indexed copy like in NumPy?

N, C, H, W = features.size()   # features is a Variable of shape (N, C, H, W)
gram = torch.zeros((N, C, C))  # plain FloatTensor to hold the Gram matrices
print(gram.size())
for i in range(N):
    f = features[i, :, :, :]   # (C, H, W) slice, still a Variable
    print(f.size())
    f = f.view(C, -1)          # flatten spatial dims -> (C, H*W)
    print(f.size())
    g = torch.mm(f, f.t())     # (C, C) Gram matrix, a Variable
    print(g.size())
    gram[i, :, :] = g          # <- fails: copying a Variable into a plain Tensor

Hi, which version of PyTorch are you using? I’m on 0.2.0 and that code runs without error for me.

Use g.data, but note that gradients won’t flow back through this copy, so you won’t be able to .backward through it.
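For example, a minimal sketch assuming a pre-0.4 PyTorch where Variable and Tensor are separate types:

# gram is a plain FloatTensor, g is a Variable: copy out the underlying tensor.
# .data sidesteps autograd, so this copy is invisible to backward().
gram[i, :, :] = g.data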

How do I check the PyTorch version? I downloaded the most recent one from their website.

Why can’t I use backward?

For example, pip list or conda list.

And the reason you cannot backpropagate is that you copy into a plain Tensor. You need to use a Variable instead to backpropagate.
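For instance, a sketch of that fix, assuming the pre-0.4 Variable API (and assuming indexed assignment between Variables is tracked by autograd in your version):

import torch
from torch.autograd import Variable

# features is the (N, C, H, W) Variable from the original code
N, C, H, W = features.size()
gram = Variable(torch.zeros(N, C, C))  # a Variable, so Variable-to-Variable copy works
for i in range(N):
    f = features[i].view(C, -1)        # (C, H*W)
    gram[i] = torch.mm(f, f.t())       # stays in the autograd graph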

Since PyTorch 0.1.12 (if I am not wrong), there has been a torch.__version__ attribute that tells you which version of PyTorch you are using.

Run torch.__version__ on a Python interpreter after importing PyTorch to find out.
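For example:

import torch
print(torch.__version__)  # e.g. '0.2.0'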


@mckowski any reason you don’t want to make gram a Variable? I think that would fix the issue.

(Edit: by the way, if you have requires_grad=False, it won’t backprop through gram, in case that was a concern.)
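And if indexed assignment into a Variable gives you trouble on your version, a safe alternative (not from this thread, just a common pattern) is to collect the per-sample Gram matrices in a list and stack them:

# N and C come from features.size() as in the original code
grams = [torch.mm(features[i].view(C, -1), features[i].view(C, -1).t())
         for i in range(N)]
gram = torch.stack(grams)  # (N, C, C) Variable, fully differentiable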