Multiple inputs in multi-task learning

Hi everyone,
I am hoping to run a multi-task learning model with two inputs.
Here is the code:

class MTLnet(nn.Module):
    def __init__(self):
        super(MTLnet, self).__init__()
        self.sharedlayer = nn.Sequential(
            nn.Linear(feature_size, shared_layer_size),
            nn.ReLU(),
            nn.Dropout()
        )
        self.tower1 = nn.Sequential(
            nn.Linear(shared_layer_size, tower_h1),
            nn.ReLU(),
            nn.Dropout(),
            nn.Linear(tower_h1, tower_h2),
            nn.ReLU(),
            nn.Dropout(),
            nn.Linear(tower_h2, output_size)
        )
        self.tower2 = nn.Sequential(
            nn.Linear(shared_layer_size, tower_h1),
            nn.ReLU(),
            nn.Dropout(),
            nn.Linear(tower_h1, tower_h2),
            nn.ReLU(),
            nn.Dropout(),
            nn.Linear(tower_h2, output_size)
        )

    def forward(self, x1, x2):
        h_shared = self.sharedlayer(x1)
        h_shared = self.sharedlayer(x2)
        out1 = self.tower1(h_shared)
        out2 = self.tower2(h_shared)
        return out1, out2

MTL = MTLnet()
print(MTL)

I am not sure whether the forward() part is right, but the results look bad.
Does anyone know a smart way of solving this issue?

Hi @wqsgsutd

This is incorrect: h_shared gets overwritten in the second line, which is causing the problem.
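
Something along these lines should work (just a sketch, keeping each input's shared activation in its own variable):

def forward(self, x1, x2):
    h1 = self.sharedlayer(x1)   # shared representation of the first input
    h2 = self.sharedlayer(x2)   # shared representation of the second input
    out1 = self.tower1(h1)
    out2 = self.tower2(h2)
    return out1, out2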

Hi Usama,
Thank you.
Is this right?
def forward(self, x1, x2):
    h_shared1 = self.sharedlayer(x1)
    h_shared2 = self.sharedlayer(x2)
    out1 = self.tower1(h_shared1)
    out2 = self.tower2(h_shared2)
    return out1, out2

Yeah, that seems fine for your use case.
:slight_smile:
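
If it helps, here is a minimal end-to-end sketch of how you could run and train the model (assuming the class above with the corrected forward). The sizes, random data, MSELoss, and Adam optimizer are just placeholders for illustration, not part of your setup:

import torch
import torch.nn as nn

# placeholder hyperparameters -- replace with your own values
feature_size = 10
shared_layer_size = 64
tower_h1 = 32
tower_h2 = 16
output_size = 1

MTL = MTLnet()
optimizer = torch.optim.Adam(MTL.parameters(), lr=1e-3)
criterion = nn.MSELoss()   # placeholder loss; use whatever fits your tasks

# dummy batches: 8 samples per task, same feature size for both inputs
x1 = torch.randn(8, feature_size)
x2 = torch.randn(8, feature_size)
y1 = torch.randn(8, output_size)
y2 = torch.randn(8, output_size)

out1, out2 = MTL(x1, x2)
loss = criterion(out1, y1) + criterion(out2, y2)   # sum the per-task losses
optimizer.zero_grad()
loss.backward()
optimizer.step()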

Hi Usama,

Thank you!