Hello, I have two tensors with the following shapes: `A.shape = [2, 1, 64000]` and `B.shape = [2, 288, 16]`.

Is it possible to concatenate the two tensors along the last dim? I tried `torch.cat((A, B), dim=-1)`, but it complains because the sizes in dim 1 (1 and 288) are not equal.
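For reference, a minimal reproduction of the error, with random tensors standing in for the real data:

```python
import torch

A = torch.randn(2, 1, 64000)
B = torch.randn(2, 288, 16)

try:
    torch.cat((A, B), dim=-1)
except RuntimeError as e:
    # Fails because the sizes in dim 1 (1 vs. 288) do not match.
    print(e)
```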

No, it’s not possible to concatenate them into a single “packed” (regular) tensor, since their sizes differ in more than one dimension. Nested tensors would allow it, but I’m unsure what the current status of this feature is.
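If you want to experiment, `torch.nested` can pack same-`ndim` tensors of different shapes into one object. This is only a sketch: the nested-tensor API is a prototype feature and supports a limited set of operations, so whether it fits your use case is an open question.

```python
import torch

A = torch.randn(2, 1, 64000)
B = torch.randn(2, 288, 16)

# A nested tensor holds tensors of different shapes (same number of dims)
# without padding or concatenating them into a single regular tensor.
nt = torch.nested.nested_tensor([A, B])
print(nt.is_nested)  # True
```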

So I must convert, for example, the 288 channels in the second tensor to 1 channel and then concatenate it with the first tensor, is that correct?

This also wouldn’t work, as you can only concatenate tensors whose shapes match in every dimension other than the specified dimension along which the tensors are concatenated.

E.g. for `torch.cat((a, b), dim=x)`

all dimensions must match besides `dim=x`.

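To illustrate this rule, concatenation along `dim=1` works as long as all other dimensions agree (the shapes here are made up for the example):

```python
import torch

a = torch.randn(2, 1, 16)
b = torch.randn(2, 5, 16)  # differs from a only in dim 1

# Valid: sizes match in every dim except dim=1, which is summed (1 + 5).
c = torch.cat((a, b), dim=1)
print(c.shape)  # torch.Size([2, 6, 16])
```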

Okay, if I have two tensors of shapes [2, 1, 16] and [2, 1, 64000], how can I concatenate them into one tensor of shape [2, 1, 64016]?

Sorry, your original approach would work in this case:

```
import torch

a = torch.randn([2, 1, 16])
b = torch.randn([2, 1, 64000])
c = torch.cat((a, b), dim=-1)  # c.shape == torch.Size([2, 1, 64016])
```

as I mistakenly read `torch.cat((A, B), dim=1)` in your initial post instead of `dim=-1`.


Yes, this is what I asked in my question. Thanks!