Concatenation of tensors with different shapes

How do I concatenate the weights of different layers with different dimensions? E.g.
torch.Size([64, 256, 4, 4])
torch.Size([1, 256, 1, 1])
torch.Size([256, 256, 3, 3])…

You can use torch.cat(tensors, dim) to concatenate two or more tensors. However, torch.cat expects all tensors to have the same shape except in the concatenation dimension.
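
For example (the shapes below are chosen just for illustration):

import torch

x = torch.zeros(64, 256, 4, 4)
y = torch.zeros(8, 256, 4, 4)

# Shapes agree in every dimension except dim 0, so this works:
z = torch.cat((x, y), dim=0)

Out: torch.Size([72, 256, 4, 4])

Concatenating (64, 256, 4, 4) with (1, 256, 1, 1) directly would raise a RuntimeError instead.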

One way to concatenate tensors of different shapes is to pad the smaller tensor first:

import torch

a = torch.zeros(64, 256, 4, 4)
b = torch.zeros(1, 256, 1, 1)
# Zero-pad b's spatial dimensions from (1, 1) to (4, 4) so they match a.
b = torch.nn.functional.pad(b, (2, 1, 2, 1), 'constant', 0)
e = torch.cat((a, b), dim=0)

Out: torch.Size([65, 256, 4, 4])
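
As a side note, the pad tuple is read in pairs starting from the last dimension: (left, right) for dim -1, then (top, bottom) for dim -2, so (2, 1, 2, 1) grows both spatial dimensions from 1 to 4. If your layers have varying spatial sizes, a small helper can compute the padding for you (pad_to_spatial is just a hypothetical name, a minimal sketch):

import torch
import torch.nn.functional as F

def pad_to_spatial(weight, target_h, target_w):
    # Zero-pad a 4-D weight (N, C, H, W) up to (target_h, target_w).
    dh = target_h - weight.shape[2]
    dw = target_w - weight.shape[3]
    # Pairs apply from the last dim backwards: (left, right), then (top, bottom).
    return F.pad(weight, (dw // 2, dw - dw // 2, dh // 2, dh - dh // 2))

b = torch.zeros(1, 256, 1, 1)
c = pad_to_spatial(b, 4, 4)

Out: torch.Size([1, 256, 4, 4])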

Another way would be to reshape instead of padding:

a = torch.zeros(64, 256, 4, 4)
# Flatten the spatial dimensions: (64, 256, 4, 4) -> (1024, 256, 1, 1).
a = torch.reshape(a, (-1, 256, 1, 1))
b = torch.zeros(1, 256, 1, 1)

c = torch.cat((a, b), dim=0)

Out: torch.Size([1025, 256, 1, 1])
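
Applied to all three shapes from your question (this assumes every weight's element count is divisible by 256, which holds here):

import torch

weights = [
    torch.zeros(64, 256, 4, 4),
    torch.zeros(1, 256, 1, 1),
    torch.zeros(256, 256, 3, 3),
]

# 64*256*4*4 = 262144, 1*256*1*1 = 256, 256*256*3*3 = 589824;
# all are divisible by 256, so each tensor reshapes to (-1, 256, 1, 1).
flat = [torch.reshape(w, (-1, 256, 1, 1)) for w in weights]
c = torch.cat(flat, dim=0)

Out: torch.Size([3329, 256, 1, 1])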

Which approach is right depends entirely on the application and the result you are expecting.

Hope this helps

Thanks, it worked :+1: