x = Variable(torch.FloatTensor([[1,2],[3,4]]))
z = torch.cat([x[0,1],x[1,0]],1)
I got an error message:
Traceback (most recent call last):
File "pra.py", line 85, in <module>
z = torch.cat([x[0,1],x[1,0]],1)
File "/usr/local/lib/python2.7/site-packages/torch/autograd/variable.py", line 748, in cat
return Concat(dim)(*iterable)
File "/usr/local/lib/python2.7/site-packages/torch/autograd/_functions/tensor.py", line 303, in forward
self.input_sizes = [i.size(self.dim) for i in inputs]
RuntimeError: dimension 2 out of range of 1D tensor at /Users/soumith/code/pytorch-builder/wheel/pytorch-src/torch/lib/TH/generic/THTensor.c:24
But similar code like this:
z = torch.cat([torch.FloatTensor([2]),torch.FloatTensor([3])],1)
or this:
x = Variable(torch.FloatTensor([[1,2],[3,4]]))
z = torch.cat([torch.diag(x[0,1]),torch.diag(x[1,0])],1)
works well.
I’m totally confused. I really need to extract some elements from one tensor and cat them into another tensor. What should I do?
The tensors you’re putting into cat are only 1D, and you’re trying to concatenate them along dim=1, i.e. the second dimension, which they don’t have. If you pass 0 to cat it will work. Remember that Python is 0-based.
I’m not sure why it works for regular tensors; I’ll need to look into that, but I think it’s unintended behaviour.
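To make the rule concrete, here is a minimal sketch (written against the current torch API, where Variable wrapping is no longer required; the cat dimension semantics are the same): a 1D tensor only has dim 0, so that is the only valid concatenation axis.

```python
import torch

# Two 1-element 1D tensors: their only dimension is dim 0.
a = torch.FloatTensor([2.0])
b = torch.FloatTensor([3.0])

# Concatenating along dim 0 joins them end to end: shape (2,)
z = torch.cat([a, b], 0)

# torch.cat([a, b], 1) would raise an error here,
# because a 1D tensor has no dim 1 to concatenate along.
```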
Thanks for the explanation!
So what should I do if I want to get a 2D tensor from a bunch of 1D tensors? I tried this:
x = Variable(torch.FloatTensor([[1,2],[3,4]]))
z = torch.cat([x[0,1],x[1,0]],0)
y = torch.cat([z,z],1)
But got this:
Traceback (most recent call last):
File "pra.py", line 86, in <module>
y = torch.cat([z,z],1)
File "/usr/local/lib/python2.7/site-packages/torch/autograd/variable.py", line 748, in cat
return Concat(dim)(*iterable)
File "/usr/local/lib/python2.7/site-packages/torch/autograd/_functions/tensor.py", line 303, in forward
self.input_sizes = [i.size(self.dim) for i in inputs]
RuntimeError: dimension 2 out of range of 1D tensor at /Users/soumith/code/pytorch-builder/wheel/pytorch-src/torch/lib/TH/generic/THTensor.c:24
Update:
Never mind, I solved it using tensor.unsqueeze().
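For later readers, here is a sketch of that approach, assuming the current torch API (slices like x[0:1, 1] are used instead of x[0, 1] so the extracted pieces stay 1D): unsqueeze adds the missing dimension before cat, and torch.stack does both steps in one call.

```python
import torch

x = torch.FloatTensor([[1, 2], [3, 4]])

# Extract elements as 1-element 1D tensors (slicing keeps a dimension).
z = torch.cat([x[0:1, 1], x[1:2, 0]], 0)   # shape (2,): [2., 3.]

# unsqueeze(0) adds a leading dimension, making z 2D: shape (1, 2).
z2 = z.unsqueeze(0)

# Now concatenating along dim 0 stacks the rows: shape (2, 2).
y = torch.cat([z2, z2], 0)

# torch.stack performs the unsqueeze + cat in a single call.
y_stacked = torch.stack([z, z], 0)          # also shape (2, 2)
```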