How can I concatenate tensors during the forward pass?

This is what I have so far:

import torch
import torch.nn as nn
import torch.nn.functional as F

input_size_prot = 1024
input_size_comp = 196
hidden_size_prot = 32
hidden_size_all = 20
output_size = 1

batch_size = 80

class pcNet(nn.Module):

    def __init__(self, input_size_prot, input_size_comp, hidden_size_prot, hidden_size_all, output_size):
        super(pcNet, self).__init__()
        self.fc_prot = nn.Linear(input_size_prot, hidden_size_prot)
        self.fc_all  = nn.Linear(hidden_size_prot + input_size_comp, hidden_size_all)
        self.fc2     = nn.Linear(hidden_size_all, output_size)

    def forward(self, x):
        out = F.leaky_relu(self.fc_prot(x[0]))
        out = torch.cat((out, x[1]), 0)  # here lies the problem
        out = F.leaky_relu(self.fc_all(out))
        out = F.relu(self.fc2(out))
        return out

So x is a list of two tensors: the first has 1024 features, which fc_prot compresses to 32, and I then try to combine the result with the second tensor (196 features). Currently I get the following error:

RuntimeError: Sizes of tensors must match except in dimension 0. Got 32 and 196 in dimension 1

Hi,

At the point of concatenation, your two tensors will look something like this:

x = [torch.randn(1, 32), torch.randn(1, 196)]

So based on this, I expect the desired out to have shape [1, 32 + 196].

Based on this, you need to concatenate along the second dimension, not the first; the first dimension is the batch dimension. So this will solve the issue:

out = torch.cat((out, x[1]), dim=1)  # dim=1: [batch_size, 32] + [batch_size, 196] -> [batch_size, 228]
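To illustrate with the shapes from your model (assuming batch_size = 80, as in your snippet):

```python
import torch

out = torch.randn(80, 32)    # output of fc_prot
comp = torch.randn(80, 196)  # x[1]

# Concatenating along dim 1 keeps the batch dim and stacks the features.
combined = torch.cat((out, comp), dim=1)
print(combined.shape)  # torch.Size([80, 228])
```

The combined tensor then matches what fc_all expects, since hidden_size_prot + input_size_comp = 32 + 196 = 228.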

Bests

Nicely explained and it works. Thanks!