Stack expects each tensor to be equal size

stack expects each tensor to be equal size but got [7, 768] at entry 0 and [8, 768] at entry 1

    ca_embedding = torch.stack(tns_embedding) 
    embeddings.append(ca_embedding)

The shape of ca_embedding the first time is torch.Size([7, 768]).
The shape of ca_embedding the second time is torch.Size([8, 768]).

@Tim5

Recreating the problem

a = torch.rand(7, 768)
b = torch.rand(8, 768)
torch.stack([a, b])
RuntimeError: stack expects each tensor to be equal size, but got [7, 768] at entry 0 and [8, 768] at entry 1

Solution

torch.concat([a, b])
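
For the two tensors above, a quick check of what this returns:

c = torch.concat([a, b])  # alias of torch.cat; joins along the existing first dimension
print(c.shape)  # torch.Size([15, 768])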

Thanks a lot, I will try it, but excuse me, does concat work like stack?

I got AttributeError: module 'torch' has no attribute 'concat'

Looks like you might be using a slightly older version of PyTorch.
You can use torch.cat in its stead.

It works, but excuse me, does cat work like stack? The shapes of the two are different … I mean, cat gives me one dimension but stack gives two.

@Tim5 there are different ways of solving a problem

a = [torch.rand(1, 10), torch.rand(1, 10), torch.rand(1, 10)]
print(torch.cat(a, axis=0).size())  # torch.Size([3, 10])
print(torch.cat(a, axis=1).size())  # torch.Size([1, 30])
print(torch.stack(a).size())        # torch.Size([3, 1, 10])

Generally, if you build tensors in a loop and then stack them, it can cause problems, because the pieces may not all end up the same size; see the sketch below.
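
A minimal sketch of that loop pattern (hypothetical names, just to illustrate):

import torch

embeddings = []
for n in (7, 8):  # sentences with different token counts
    tokens_embedding = [torch.randn(768) for _ in range(n)]
    embeddings.append(torch.stack(tokens_embedding))  # [7, 768], then [8, 768]

# torch.stack(embeddings)  # RuntimeError: stack expects each tensor to be equal size
flat = torch.cat(embeddings, dim=0)  # works: torch.Size([15, 768])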

If my tensors come out like this:
_embedding shape is torch.Size([7, 768])
_embedding shape is torch.Size([8, 768]) … and so on,
and I need to combine them so that all the different numbers like 7, 8, … add up, then I need to use this line:
torch.cat(_embedding, axis=0), right? Because I got the summation only, without the 768.
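
Something like this is what I mean (a sketch with made-up sizes):

_embedding = [torch.randn(7, 768), torch.randn(8, 768)]
print(torch.cat(_embedding, axis=0).size())  # torch.Size([15, 768])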

torch.stack and torch.cat both ‘concatenate’ Tensors, but the key difference is that torch.stack applies the concatenation along a new dimension, thereby stacking the Tensors, whereas torch.cat concatenates directly along an already existing dimension.

For example,

a = torch.randn(7, 768)
b = torch.randn(8, 768)

torch.cat((a, b)).shape  # returns torch.Size([15, 768])

Trying torch.stack results in the error,

RuntimeError: stack expects each tensor to be equal size, but got [7, 768] at entry 0 and [8, 768] at entry 1

because torch.stack takes Tensors of the same shape and stacks them into a larger Tensor. You can think of having B matrices of size [N, N] and stacking them together to get a Tensor of shape [B, N, N].
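
A quick sketch of that picture, with B = 3 and N = 2:

mats = [torch.randn(2, 2) for _ in range(3)]  # B matrices of shape [N, N]
batch = torch.stack(mats)  # torch.Size([3, 2, 2]), i.e. [B, N, N]
assert torch.equal(batch[0], mats[0])  # each original matrix is still recoverable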

I’ll repeat your original problem with two new tensors that should explain the difference between torch.stack and torch.cat.

For example,

aa = torch.randn(4, 768)
bb = torch.randn(4, 768)

torch.stack((aa, bb)).shape  # returns torch.Size([2, 4, 768]), stacked along a new dimension
torch.cat((aa, bb)).shape    # returns torch.Size([8, 768]), concatenated along an existing dimension

This example should clearly show the difference between torch.stack and torch.cat. 🙂


Really, thanks for your help and the great illustration of the difference, but I got the summation only … why didn’t the 768 appear? It returns torch.Size([6144]) for me.

Which command did you use that returned torch.Size([6144])? (Also, what version of PyTorch are you using?)

The version is 1.8.1+cu111.
The command is
_embedding = torch.cat(tokens_embedding)
where tokens_embedding holds different values each time, coming from another loop.

What size is tokens_embedding?

If all your token embeddings are of size torch.Size([768]) and, say, you have four of them, you’ll want to use torch.stack, as in this example:

tokens_embedding = [torch.randn(768) for _ in range(4)]  # list of 4 Tensors of shape [768]

_embedding = torch.cat(tokens_embedding)
_embedding.shape  # returns torch.Size([3072])

_embedding = torch.stack(tokens_embedding)
_embedding.shape  # returns torch.Size([4, 768])

I already used stack before, but got a different shape each time, and that caused a problem when I tried to run this line after it:

    embeddings.append(_embedding)

Can you post a minimal reproducible example? It’s hard to visualize your error without the broader context.

If it’s purely the original error, it’s because your Tensors are of different sizes, so torch.stack won’t work and you might need to use torch.cat instead.

The code is too big to post, with the for loops and the file that I have as well … I will describe my problem in the following lines.

I implemented this line, _embedding = torch.stack(tokens_embedding), inside a loop; every time it reads a sentence I get results like the following:

size is  torch.Size([8, 768])
size is  torch.Size([9, 768])
size is  torch.Size([10, 768])
size is  torch.Size([13, 768])
size is  torch.Size([7, 768])
and so on

After it finished, I had to use this line
embeddings.append(_embedding)
but got

RuntimeError: stack expects each tensor to be equal size, but got [7, 768] at entry 0 and [8, 768] at entry 1

And you want to stack these Tensors here? That won’t work because of the original problem; you could use torch.cat instead. Although you’ll lose the information about which sets of rows correspond to which original Tensors.
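
If you do need to recover the groups afterwards, one option (just a sketch, not the only way) is to record the lengths before concatenating and use torch.split to get the pieces back:

embeddings = [torch.randn(7, 768), torch.randn(8, 768)]
lengths = [e.size(0) for e in embeddings]  # [7, 8]

flat = torch.cat(embeddings, dim=0)  # torch.Size([15, 768])
pieces = torch.split(flat, lengths, dim=0)  # shapes [7, 768] and [8, 768] again
assert torch.equal(pieces[1], embeddings[1])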