Concatenate the tensors within a tensor - PyTorch

Hello,

I have a task to complete:

a is a tensor of shape torch.Size([2, 1, 25, 25]).

From this tensor, I want to produce a tensor of shape torch.Size([2, N, 25, 25]), where N is a variable.

If N > 1, the [25, 25] slice (the third and fourth dimensions) should be concatenated N times along the second dimension, and that slice is different for each element of the first dimension. In this example, tensor a has 2 elements in the first dimension.

For example, if N=3, then the output tensor shape should be torch.Size([2, 3, 25, 25])

To achieve this, I tried building a Python list, but it takes a lot of time. How can I do this using pure PyTorch tensor ops?
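Roughly, what I have now looks like this (a simplified sketch, with random data in place of my real tensor):

```python
import torch

a = torch.randn(2, 1, 25, 25)   # placeholder data, same shape as my real tensor
N = 3

# build a Python list per batch element and put it back together with cat/stack
rows = []
for b in range(a.size(0)):
    rows.append(torch.cat([a[b]] * N, dim=0))   # [N, 25, 25] for this batch element
out = torch.stack(rows, dim=0)                  # torch.Size([2, 3, 25, 25])
print(out.shape)
```

This gives the right shape, but the Python-level loop is what I would like to avoid.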

Thanks.

Hi Vishak,

To make sure I understand the problem: you have a tensor with shape [B, N, X, Y], where B is the batch size, N is the length of the data, and X, Y are the dimensions of your input.

The issue you are trying to solve is that every N-element has a different X, Y size.

If that is the problem you are having, I would suggest padding with pad_sequence, so that every N-element has a constant size equal to the maximum size in your dataset. Look into this post for more information.
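For example, roughly like this (a minimal sketch with made-up sizes; note that pad_sequence pads only along the first dimension, so this assumes the trailing dimension is already the same for every element):

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# elements whose first (variable) dimension differs
elems = [torch.randn(20, 25), torch.randn(25, 25), torch.randn(18, 25)]

# pad each element up to the longest one; batch_first=True gives [N, max_X, Y]
padded = pad_sequence(elems, batch_first=True)
print(padded.shape)   # torch.Size([3, 25, 25])
```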

Then you can use torch.stack to stack those tensors along the second dimension.
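Something like this (shapes chosen just to match your example):

```python
import torch

B, N, X, Y = 2, 3, 25, 25

# suppose after padding you have N tensors, each of shape [B, X, Y]
parts = [torch.randn(B, X, Y) for _ in range(N)]

# stack along dim=1 (the second dimension) to get [B, N, X, Y]
out = torch.stack(parts, dim=1)
print(out.shape)   # torch.Size([2, 3, 25, 25])
```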