Combine 2 2D-tensors into a 3D tensor

Hi everybody,

I’m looking for a way to do the following: let’s assume we have a tensor A of dimension [N, F] and a tensor B of dimension [N, F]. I would like to obtain a tensor C of dimension [N, N, 2*F].

Is there a way to do this?

Thanks for your help!

Tensor A has N*F elements
Tensor B has N*F elements
Tensor C will have N*N*F*2 elements. I would have expected you to want N*F*2 elements in C.

If you wanted C to have shape (N, 2*F) then torch.cat([A, B], dim=1) would do.
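
For reference, a minimal runnable sketch of that one-liner (shapes made up for illustration):

import torch

N, F = 4, 3
A = torch.randn(N, F)
B = torch.randn(N, F)
C = torch.cat([A, B], dim=1)  # each row of C is [a_row, b_row]
print(C.size())               # (4, 6), i.e. (N, 2*F)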

It wasn’t a typo; I would indeed like a tensor NxNx2F. Otherwise, cat would have been perfect :wink: However, I’m not aware of anything that does this. I thought about something like torch.repeat but didn’t find exactly what I’m looking for.

You are going to have to spell out what you want to put in each position of C.

I need to implement a function that takes as input x: 1xF and y: 1xF and outputs a scalar. I’m looking to implement it in a GPU-efficient manner, i.e. create all pairs x_i, y_j and compute the scalars in one operation. This is why I thought about (NxF)x(NxF) --> NxNx2F.

Is N your batch_size?
In which case C of shape NxFxF seems more appropriate to me.

Yes, exactly. Why do you think it would be more appropriate?
My idea afterwards was to use a feed-forward net to compute the attention coefficient, using the 2F vector as input.

Well, if x contains F elements and y contains F elements then there are FxF pairings, not 2xF pairings.

But I had forgotten that there are two elements in each pairing, so the size should be NxFxFx2.

I can’t see why you need the pairings but this should do it.

A = A.unsqueeze(2).repeat([1, 1, F]) # shape NxFxF with every value repeated along the last dim
B = B.unsqueeze(1).repeat([1, F, 1]) # shape NxFxF with every value repeated along the middle dim
C = torch.stack((A, B), dim=3)

C has 4 dimensions of size N, F, F, 2
C[n, fa, fb] contains the pairing A[fa], B[fb] from batch n.
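
A quick sanity check of that indexing, with made-up values:

import torch

N, F = 2, 3
A = torch.arange(N * F).float().view(N, F)        # [[0, 1, 2], [3, 4, 5]]
B = torch.arange(N * F).float().view(N, F) + 100  # [[100, 101, 102], [103, 104, 105]]

A_exp = A.unsqueeze(2).repeat([1, 1, F])  # N x F x F, value depends on dim 1
B_exp = B.unsqueeze(1).repeat([1, F, 1])  # N x F x F, value depends on dim 2
C = torch.stack((A_exp, B_exp), dim=3)    # N x F x F x 2

print(C[0, 1, 2])  # [1., 102.] == (A[0, 1], B[0, 2])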

Thank you for your answer, I’ll give it a try later and see if it fits. I see your point. The 2xF pairing is because I would like to concatenate along the last dimension in order to get 2F, and simply use a 2Fx1 weight matrix instead of FxFx1.

2xF is enough to store all elements of both A and B, but not enough to explicitly store all the pairings. That said, if you are just going to add a linear layer in order to produce one scalar of output, then 2xF should be fine, and this will do the job:

C = torch.stack([A, B], dim=2)  # shape N x F x 2
C_2F = C.view(N, -1)            # shape N x 2F
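
Note that stack along dim=2 followed by view interleaves the two feature sets (a1, b1, a2, b2, ...) rather than placing them block-wise; for a linear layer this is just a fixed permutation of the columns, so either layout works. A minimal sketch of the difference, with made-up values:

import torch

A = torch.FloatTensor([[1, 2], [3, 4]])
B = torch.FloatTensor([[10, 20], [30, 40]])

print(torch.stack([A, B], dim=2).view(2, -1))  # [[1, 10, 2, 20], [3, 30, 4, 40]]
print(torch.cat([A, B], dim=1))                # [[1, 2, 10, 20], [3, 4, 30, 40]]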

I think you highlight a good point! Maybe I’m asking for something that isn’t possible the way I explained it.

Let me give an example; it should make things clearer:

a = torch.FloatTensor([[1,2,3],[4,5,6]])
b = torch.FloatTensor([[10,20,30],[40,50,60]])

I would like as output something like this (the final dimension is NxNx2F; here for simplicity it’s shown flattened as N*Nx2F):

1 2 3 1 2 3
1 2 3 4 5 6
1 2 3 10 20 30
1 2 3 40 50 60
4 5 6 1 2 3
4 5 6 4 5 6
4 5 6 10 20 30
4 5 6 40 50 60
10 20 30 1 2 3
10 20 30 4 5 6
10 20 30 10 20 30
10 20 30 40 50 60
40 50 60 1 2 3
40 50 60 4 5 6
40 50 60 10 20 30
40 50 60 40 50 60

Does it make more sense now?

Not really.

a.size() == (2, 3), so N == 2, right?
Your proposed output seems to be of size (16, 6).

But 16 != 2*2

Oh yes, sorry, in my calculation I treated A and B as a single tensor. So with C = torch.cat([A, B], dim=0) it makes sense (because now N=4).

So basically, you concatenate a and b along the first dimension, and then take all pairings along that first dimension.

If so, my previous code can be adapted to do the job.

Thank you for your answer!

Finally I have:

a = torch.FloatTensor([[1,2,3],[4,5,6]])
b = torch.FloatTensor([[10,20,30],[40,50,60]])
c = torch.cat([a, b], dim=0)  # shape 4x3, so N=4, F=3
N,F = c.size()

c_1 = c.repeat(1, N).view(N*N, -1)   # each row repeated N times consecutively
c_2 = c.repeat(N, 1)                 # the whole tensor tiled N times
c_2f = torch.cat([c_1, c_2], dim=1)  # shape N*N x 2F
c_2f  # can also be viewed as NxNx2F with c_2f.view(N, N, -1)
    1     2     3     1     2     3
    1     2     3     4     5     6
    1     2     3    10    20    30
    1     2     3    40    50    60
    4     5     6     1     2     3
    4     5     6     4     5     6
    4     5     6    10    20    30
    4     5     6    40    50    60
   10    20    30     1     2     3
   10    20    30     4     5     6
   10    20    30    10    20    30
   10    20    30    40    50    60
   40    50    60     1     2     3
   40    50    60     4     5     6
   40    50    60    10    20    30
   40    50    60    40    50    60
[torch.FloatTensor of size 16x6]

Do you think there is a better way to do it in terms of memory, using another existing function instead of torch.repeat?
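
One candidate, sketched under the assumption that only the final result needs to be materialised: expand() returns a view over the original storage instead of copying, so the intermediates below cost no extra memory; only the final cat allocates the full NxNx2F result.

import torch

a = torch.FloatTensor([[1, 2, 3], [4, 5, 6]])
b = torch.FloatTensor([[10, 20, 30], [40, 50, 60]])
c = torch.cat([a, b], dim=0)
N, F = c.size()

# expand() only broadcasts singleton dimensions, creating views, not copies
c_1 = c.unsqueeze(1).expand(N, N, F)  # [i, j] -> c[i]
c_2 = c.unsqueeze(0).expand(N, N, F)  # [i, j] -> c[j]

c_2f = torch.cat([c_1, c_2], dim=2)   # N x N x 2F; only this step copies
c_2f.view(N * N, -1)                  # same 16x6 layout as above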
