Expand a 2d tensor to 3d tensor

Let’s say I have a 2d tensor A

A = [[0,1,2],
     [3,4,5],
     [6,7,8]]

I want to copy each row 10 times and stack the copies, which will then give me a 3d tensor. So I will have a 3 x 3 x 10 tensor.

How can I do this?

I know that a vector can be expanded by using expand_as, but how do I expand a 2d tensor?

Moreover, I want to reshape a 3d tensor.
So for example, 2 x 3 x 4 tensor to 3 x 2 x 4.

How can I do this?


You can use unsqueeze to add another dimension, after which you can use expand:

a = torch.Tensor([[0,1,2],[3,4,5],[6,7,8]])
a = a.unsqueeze(2).expand(3,3,10)

This will give a tensor of shape 3x3x10. With transpose you can swap two dimensions. For example, we can swap the first with the third dimension to get a tensor of shape 10x3x3:

a = a.transpose(2, 0)
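Putting the pieces together, here is a minimal sketch that checks the shape at each step, and also handles the 2 x 3 x 4 → 3 x 2 x 4 question from above with the same transpose call:

```python
import torch

# 2d tensor of shape (3, 3)
a = torch.tensor([[0, 1, 2], [3, 4, 5], [6, 7, 8]], dtype=torch.float)

# add a trailing dimension, then expand it to size 10 -> shape (3, 3, 10)
b = a.unsqueeze(2).expand(3, 3, 10)
print(b.shape)  # torch.Size([3, 3, 10])

# swap the first and third dimensions -> shape (10, 3, 3)
c = b.transpose(2, 0)
print(c.shape)  # torch.Size([10, 3, 3])

# the second question: swap the first two dims of a 2 x 3 x 4 tensor
d = torch.zeros(2, 3, 4).transpose(0, 1)
print(d.shape)  # torch.Size([3, 2, 4])
```

Note that expand does not copy memory; if you need to write to the result, call .contiguous() (or use repeat) first.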

I have

A = [[[0,1,2], [3,4,5], [6,7,8]],
     [[9,10,11], [12,13,14], [15,16,17]]]

So the dimension is 2 x 3 x 3 (batch size, len, features). How can I now repeat each row batchwise?

Assuming A is a list, you can do the following: A = torch.tensor(A*10). But if A is a tensor rather than a list, you can split it into a list, repeat the elements, and concatenate them back into a tensor:

>>> A2 = torch.cat(list(torch.split(A, 1, dim=0))*10)
>>> A2.shape
torch.Size([20, 3, 3])
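When A is already a tensor, tensor.repeat should give the same result without the split/cat round trip. A sketch, assuming a (2, 3, 3) batch as above:

```python
import torch

A = torch.arange(18).reshape(2, 3, 3)  # shape (2, 3, 3)

# tile the whole batch 10 times along dim 0 -> shape (20, 3, 3),
# the same A0, A1, A0, A1, ... ordering as the split/cat version
A2 = A.repeat(10, 1, 1)
print(A2.shape)  # torch.Size([20, 3, 3])
```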

Is there any way to get output like [[0,1,2],[0,1,2],[0,1,2], ... 10 times, [3,4,5],[3,4,5], ... 10 times]? I am actually trying to reproduce https://stackoverflow.com/questions/49358396/pytorch-how-to-implement-attention-for-graph-attention-layer, just for reference.

There is a numpy solution I know of: np.tile([0, 1, 2], 10)

It did not help, as mine is a 3D tensor.


There may be a simpler way :slight_smile:, but this one should work.
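For the interleaved output asked about above (row 0 ten times, then row 1 ten times, ...), torch.repeat_interleave may be that simpler way. A sketch: dim=0 repeats the rows of a 2d tensor, dim=1 repeats them batch-wise in a 3d tensor:

```python
import torch

a = torch.tensor([[0, 1, 2], [3, 4, 5], [6, 7, 8]])

# repeat each row 10 times consecutively -> shape (30, 3)
out = torch.repeat_interleave(a, 10, dim=0)
print(out[:3])  # tensor([[0, 1, 2], [0, 1, 2], [0, 1, 2]])

# batch-wise: repeat each row within every batch element -> shape (2, 30, 3)
A = torch.arange(18).reshape(2, 3, 3)
A2 = torch.repeat_interleave(A, 10, dim=1)
print(A2.shape)  # torch.Size([2, 30, 3])
```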