Torch.cat working on 1D Tensor, but not on 2D

I’m trying to apply a softmax to certain index ranges of a 2D tensor of size (batch_size, 30).

The code is like this:

    # vector of size [batch_size, 30]

    # area type
    a = F.sigmoid(x[:, 0])
    # curvature
    b = F.softmax(x[:, 1: 4], dim=1)
    # facilities for bicycles
    c = F.softmax(x[:, 4: 9], dim=1)
    # lane width
    d = F.softmax(x[:, 9: 12], dim=1)
    # median type
    e = F.softmax(x[:, 12: 22], dim=1)
    # number of lanes
    f = F.softmax(x[: , 22: 25], dim=1)
    # rest
    g = F.sigmoid(x[: , 25:])
    
    print(torch.cat([a, b, c, d, e, f, g], dim=1))

Afterwards I try to concatenate or stack the vectors a to g back to one array, but it shows me this error:

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-56-956fcd57bc35> in <module>()
      1 model.cpu()
----> 2 model(x[:5].cpu())

~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    323         for hook in self._forward_pre_hooks.values():
    324             hook(self, input)
--> 325         result = self.forward(*input, **kwargs)
    326         for hook in self._forward_hooks.values():
    327             hook_result = hook(self, input, result)

<ipython-input-55-3e40b2ee85c5> in forward(self, x)
     36         g = F.sigmoid(x[: , 25:])
     37 
---> 38         print(torch.cat([a, b, c, d, e, f, g], dim=1))
     39         #return torch.cat((a, b, c, d, e, f, g), dim=1)
     40 

RuntimeError: dimension out of range (expected to be in range of [-1, 0], but got 1)

Changing the dim to 0 or -1 doesn’t work. It works if I try the concatenation with a batch size of 1.

Hi,

torch.cat can only concatenate along an existing dimension, so if you have a 1D tensor, the only valid dimension is the 0th one.
If you want to add a new dimension along which to concatenate your tensors, use the torch.stack function.
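As a quick illustration (a minimal sketch with random stand-in tensors, not code from the thread):

```python
import torch

# Two 1D tensors of size [5]: cat can only join them along dim 0.
u = torch.randn(5)
v = torch.randn(5)

print(torch.cat([u, v], dim=0).shape)    # torch.Size([10])
# torch.cat([u, v], dim=1)  # RuntimeError: dimension out of range

# stack inserts a brand-new dimension, so dim=1 is valid here.
print(torch.stack([u, v], dim=1).shape)  # torch.Size([5, 2])
```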


Hi albanD,

Thanks for the response.

The size of a and b are this:

    print(a.size(), b.size())

torch.Size([5]) torch.Size([5, 3])

How would you apply torch.stack? I’ve tried dim = -2, -1, 0, 1, but I keep getting errors.

inconsistent tensor sizes at /opt/conda/conda-bld/pytorch_1512387374934/work/torch/lib/TH/generic/THTensorMath.c:2864

If what you expect is a 5x4 tensor, then with these a and b you need to do:
torch.cat([a.unsqueeze(-1), b], 1) (EDITED: stack -> cat), so that cat receives a tensor of size 5x1 and one of size 5x3 and concatenates them along dimension 1 to get a 5x4 tensor.
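To see the shapes concretely, here is a sketch with random stand-ins for a and b:

```python
import torch

a = torch.randn(5)      # 1D, like the sigmoid output x[:, 0]
b = torch.randn(5, 3)   # 2D, like a softmax output

# unsqueeze(-1) turns [5] into [5, 1], so cat along dim 1 is valid.
out = torch.cat([a.unsqueeze(-1), b], dim=1)
print(out.shape)  # torch.Size([5, 4])
```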

Ah, that seems logical…

However, it still doesn’t seem to work:

    print(a.unsqueeze(-1).size(), b.size())
    print(torch.stack([a.unsqueeze(-1), b], dim=1))

Leads to:

torch.Size([5, 1]) torch.Size([5, 3])

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-42-976b8e293ff3> in <module>()
      1 #x = Variable(torch.from_numpy(X[selection]), requires_grad=0)
      2 model.cpu()
----> 3 model(x[:5].cpu())

~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/torch/nn/modules/module.py in __call__(self, *input, **kwargs)
    323         for hook in self._forward_pre_hooks.values():
    324             hook(self, input)
--> 325         result = self.forward(*input, **kwargs)
    326         for hook in self._forward_hooks.values():
    327             hook_result = hook(self, input, result)

<ipython-input-41-27a480fa83be> in forward(self, x)
     51 
     52         print(a.unsqueeze(-1).size(), b.size())
---> 53         print(torch.stack([a, b], dim=1))
     54 
     55         #return x

~/anaconda3/envs/pytorch_p36/lib/python3.6/site-packages/torch/functional.py in stack(sequence, dim, out)
     62     inputs = [t.unsqueeze(dim) for t in sequence]
     63     if out is None:
---> 64         return torch.cat(inputs, dim)
     65     else:
     66         return torch.cat(inputs, dim, out=out)

RuntimeError: inconsistent tensor sizes at /opt/conda/conda-bld/pytorch_1512387374934/work/torch/lib/TH/generic/THTensorMath.c:2864

Sorry,

I meant to write cat above, I edited my comment.

Also, the code that is actually running is not the one you showed; from the stack trace, it is:

     52         print(a.unsqueeze(-1).size(), b.size())
---> 53         print(torch.stack([a, b], dim=1))

where the unsqueeze() is missing in the second call (and, as edited above, it should be cat rather than stack).
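For completeness, a sketch of the whole forward block, not taken from the thread: slicing with a range like x[:, 0:1] instead of x[:, 0] keeps the column dimension, so every piece stays 2D and torch.cat(..., dim=1) works without any unsqueeze. The random x below is a stand-in for the network output.

```python
import torch
import torch.nn.functional as F

x = torch.randn(5, 30)  # stand-in for a batch of network outputs

# Range slices (0:1 rather than 0) keep every piece 2D.
a = torch.sigmoid(x[:, 0:1])        # area type            [5, 1]
b = F.softmax(x[:, 1:4], dim=1)     # curvature            [5, 3]
c = F.softmax(x[:, 4:9], dim=1)     # bicycle facilities   [5, 5]
d = F.softmax(x[:, 9:12], dim=1)    # lane width           [5, 3]
e = F.softmax(x[:, 12:22], dim=1)   # median type          [5, 10]
f = F.softmax(x[:, 22:25], dim=1)   # number of lanes      [5, 3]
g = torch.sigmoid(x[:, 25:])        # rest                 [5, 5]

out = torch.cat([a, b, c, d, e, f, g], dim=1)
print(out.shape)  # torch.Size([5, 30])
```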

Yes, thanks! That one is working! 🙂