How to interleave two tensors along a certain dimension?

Hi there,

Say I have two tensors like [[1,1],[1,1]] and [[2,2],[2,2]]. How could I interleave them along the n_w or n_h dimension to get [[1,2,1,2],[1,2,1,2]] or [[1,1],[2,2],[1,1],[2,2]]? In TensorFlow I can achieve this using tf.reshape after tf.stack, but porting it to PyTorch using view after torch.stack does not return what I want.
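
Roughly, the TensorFlow version I have in mind (a minimal sketch):

import tensorflow as tf

a = tf.constant([[1, 1], [1, 1]])
b = tf.constant([[2, 2], [2, 2]])

# Stack on a new innermost axis, then fold it into the width.
tf.reshape(tf.stack([a, b], axis=2), [2, 4])  # [[1, 2, 1, 2], [1, 2, 1, 2]]
# Stack on a new axis after the rows, then fold it into the height.
tf.reshape(tf.stack([a, b], axis=1), [4, 2])  # [[1, 1], [2, 2], [1, 1], [2, 2]]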

Thanks.

1 Like

I’m sure there is a better way to achieve this, but as of now, you could do the following:

a = torch.Tensor([[1,1], [1,1]])
b = torch.Tensor([[2,2], [2,2]])

# Stack to shape (2, 2, 2), view as (2, 4) -> rows [1,1,1,1] and [2,2,2,2],
# transpose so the values interleave, then view back as (2, 4).
torch.stack((a, b), dim=0).view(2, 4).t().contiguous().view(2, 4)
>> 1 2 1 2
>> 1 2 1 2
>> [torch.FloatTensor of size 2x4]

# Same trick for rows: view as (4, 2), transpose, then view back as (4, 2).
torch.stack((a, b), dim=0).view(4, 2).t().contiguous().view(4, 2)
>> 1 1
>> 2 2
>> 1 1
>> 2 2
>> [torch.FloatTensor of size 4x2]
7 Likes

I’m not sure if there is a better way, but your answer definitely saved my day. :grinning:

Cheers!

How come you don’t do

torch.stack((a,b), dim=2).view(2,4)
>> tensor([[ 1.,  2.,  1.,  2.],
           [ 1.,  2.,  1.,  2.]])

Similarly, the vertical interleave can be:

torch.stack((a,b), dim=1).view(4,2)
>> tensor([[ 1.,  1.],
           [ 2.,  2.],
           [ 1.,  1.],
           [ 2.,  2.]])
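
More generally, you can stack onto a new axis right after the target dimension and then merge it back with reshape; a minimal sketch of a hypothetical interleave helper:

def interleave(a, b, dim):
    # Insert a new axis right after `dim`, so the two tensors alternate along it.
    stacked = torch.stack((a, b), dim=dim + 1)
    # Fold the new axis into `dim`, doubling its size.
    shape = list(a.shape)
    shape[dim] *= 2
    return stacked.reshape(shape)

interleave(a, b, dim=1)  # [[1, 2, 1, 2], [1, 2, 1, 2]]
interleave(a, b, dim=0)  # [[1, 1], [2, 2], [1, 1], [2, 2]]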

I just started using PyTorch at version 0.4, so maybe it’s something they added recently. Also, I’m sure there’s a good reason you’ve made the calls to contiguous(); is there something I should know about view()?

6 Likes

That’s what I mean by “I’m sure there is a better way to achieve this”. :wink:
Thanks for the code. It’s a lot clearer than my approach.

I’ve called contiguous() on the result since I transposed it with t(), and transposing returns a non-contiguous view. That can lead to errors for operations that require contiguous tensors:

a = torch.randn(2, 4)
print(a.is_contiguous())      # True
print(a.t().is_contiguous())  # False: t() returns a non-contiguous view
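
Concretely, view() will refuse to flatten the non-contiguous transpose, while contiguous().view() (or reshape()) copies the data as needed; a minimal sketch:

t = a.t()
# t.view(-1)             # RuntimeError: view size is not compatible with input tensor's size and stride
t.contiguous().view(-1)  # works: makes a contiguous copy first
t.reshape(-1)            # works: copies only when necessary
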
1 Like

from einops import rearrange
rearrange([a, b], 't h w -> h (w t)')  # [[1,2,1,2],[1,2,1,2]]
rearrange([a, b], 't h w -> (h t) w')  # [[1,1],[2,2],[1,1],[2,2]]

The same code works in TensorFlow and NumPy (einops is backend-agnostic).
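
For instance, with NumPy (a minimal sketch; einops stacks a list of arrays automatically):

import numpy as np
from einops import rearrange

a = np.ones((2, 2))
b = np.full((2, 2), 2.0)
rearrange([a, b], 't h w -> h (w t)')  # [[1., 2., 1., 2.], [1., 2., 1., 2.]]
rearrange([a, b], 't h w -> (h t) w')  # [[1., 1.], [2., 2.], [1., 1.], [2., 2.]]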

4 Likes