# Concatenate same vector to all rows of matrix

Hey, I would like to know how I can concatenate two tensors like this:

``````t1 = torch.rand(2, 10, 512)
t2 = torch.rand(2, 768)
``````

and get tensor like this:

``````>>> torch.Size([2, 10, 1280])
``````

Let’s assume that shapes are:

``````t1_shape = (batch_size, sequence_len, embedding_dim)
t2_shape = (batch_size, embedding_dim)
``````

I want to concatenate these tensors along `embedding_dim`, so that every tensor along the `sequence_len` dimension is concatenated with the same `t2` tensor.

As a solution I see:

``````t2 = t2.unsqueeze(1)
t2.size()
>>> torch.Size([2, 1, 768])
t2 = torch.cat((t2,) * 10, dim=1)
t2.size()
>>> torch.Size([2, 10, 768])
torch.cat((t1, t2), dim=2)
``````

But I'm afraid this approach costs memory, since it duplicates the `t2` tensor 10 times (in reality it can be much more).
Is there a more memory-efficient solution?
Thank you!

You could use `t2.unsqueeze(1).expand(-1, 10, -1)`. This takes no extra memory (before the `cat`, that is) because `expand` returns a view (print `stride()` and `shape` to see what is going on): the size-10 dimension just has stride 0, i.e. all "copies" share the same memory location.
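Putting that together with the shapes from your example, a minimal sketch:

``````python
import torch

t1 = torch.rand(2, 10, 512)
t2 = torch.rand(2, 768)

# expand() returns a view; the size-10 dimension has stride 0,
# so no data is copied before the cat
t2_view = t2.unsqueeze(1).expand(-1, t1.size(1), -1)
print(t2_view.stride())  # (768, 0, 1) -> dim 1 has stride 0

out = torch.cat((t1, t2_view), dim=2)
print(out.shape)  # torch.Size([2, 10, 1280])
``````

Note that `torch.cat` itself still materializes the full output tensor; the saving is only that the intermediate repeated `t2` is never allocated.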

Best regards

Thomas

That's what I was looking for!
Thank you