I have the following code and the last line is giving me that error. Is there a way I can do an indexing copy (index-based assignment) like in NumPy?
N, C, H, W = features.size()
gram = torch.zeros((N, C, C))
for i in range(N):
    f = features[i, :, :, :]
    f = f.view(C, -1)
    g = torch.mm(f, f.t())
    gram[i, :, :] = g
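As a side note, the loop can also be avoided entirely: torch.bmm computes all N Gram matrices in one batched call. A minimal sketch (the sizes here are illustrative, not from the thread):

```python
import torch

# Illustrative sizes; in the original code these come from features.size().
N, C, H, W = 2, 3, 4, 5
features = torch.randn(N, C, H, W)

# Flatten each sample's feature maps to (C, H*W), then batch-multiply:
# gram[i] = f_i @ f_i^T for every i, without a Python loop.
f = features.view(N, C, H * W)
gram = torch.bmm(f, f.transpose(1, 2))  # shape (N, C, C)
```

Because gram is produced by differentiable ops rather than in-place indexed assignment, autograd can also trace through it directly.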
Hi, which version of PyTorch do you use? I’m using 0.2.0 and there is no error with that code.
Use g.data instead, but note that you won't be able to call .backward() through this.
How do I check the PyTorch version? I downloaded the most recent one from their website.
Why can't I use backward?
Run pip list, or check torch.__version__. And the reason you cannot backpropagate is that you are using a Tensor; you need to use a Variable instead to back-propagate.
Since PyTorch 0.1.12 (if I am not wrong), they have included a torch.__version__ attribute that tells you which version of PyTorch you are using. Evaluate torch.__version__ in a Python interpreter after importing PyTorch to find out.
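Concretely, the check is a one-liner:

```python
import torch

# Prints the installed PyTorch version string, e.g. "0.2.0".
print(torch.__version__)
```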
@mckowski any reason you don't want to make
gram a Variable? That would then fix the issue, I think.
(Edit: by the way, if you create it with
requires_grad=False, it won't backprop through
gram, in case this was a concern.)
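A sketch of that Variable approach, with illustrative sizes (note that in modern PyTorch, Variable is a thin compatibility wrapper and requires_grad on plain tensors does the same job). Building gram out of differentiable ops, rather than indexed assignment into a pre-allocated buffer, lets autograd trace the whole computation:

```python
import torch
from torch.autograd import Variable  # compatibility alias in modern PyTorch

N, C, H, W = 2, 3, 4, 5  # illustrative sizes
features = Variable(torch.randn(N, C, H, W), requires_grad=True)

# Stack the per-sample Gram matrices; every op here is differentiable,
# so the graph from features to gram is recorded by autograd.
gram = torch.stack([torch.mm(features[i].view(C, -1),
                             features[i].view(C, -1).t())
                    for i in range(N)])

# Gradients now flow back to features.
gram.sum().backward()
```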