How to train two networks in sequential order simultaneously using a single loss function at the end?

Hi,

I have two separately trained networks that I am stitching together, and I intend (for fine-tuning) to train them jointly, based on a loss function applied to the second network’s output. Is this possible in PyTorch, or do I need to define a new class that puts them into a single network? I look forward to your responses.

It is as simple as that; you do not need to define anything new. You can just do it this way:

out_1 = network_1(input)              # forward through the first network
out_2 = network_2(out_1)              # feed its output into the second
loss = loss_function(out_2, target)
loss.backward()                       # gradients flow back through both networks

If there is no processing in between, you can also put both of them into a single nn.Sequential: network = nn.Sequential(network_1, network_2).
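A minimal sketch of the whole fine-tuning step, using two small nn.Linear modules as hypothetical stand-ins for the pre-trained networks (the layer sizes, loss, and optimizer are assumptions for illustration):

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins for the two pre-trained networks.
network_1 = nn.Linear(8, 4)
network_2 = nn.Linear(4, 2)

# Stitch them together; no new class needed.
network = nn.Sequential(network_1, network_2)

loss_function = nn.MSELoss()
optimizer = torch.optim.SGD(network.parameters(), lr=0.01)

inp = torch.randn(16, 8)
target = torch.randn(16, 2)

optimizer.zero_grad()
out = network(inp)                    # equivalent to network_2(network_1(inp))
loss = loss_function(out, target)
loss.backward()                       # gradients reach both networks
optimizer.step()

# The single loss produced gradients for every parameter in both parts.
print(all(p.grad is not None for p in network_1.parameters()))  # True
print(all(p.grad is not None for p in network_2.parameters()))  # True
```

Since autograd tracks the full computation graph from `inp` to `loss`, one `backward()` call is enough to update both networks jointly.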


Thanks for the response, this looks very good, except that I don’t know how to tell my optimizer which parameters should be optimized. If I use net = nn.Sequential(net1, net2), can I simply give the optimizer net.parameters()?


Yes, it will work that way.
If you want to keep two separate networks, you have to do as explained in this post: Giving multiple parameters in optimizer
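For the separate-networks case, a small sketch of the two usual options (the learning rates and the Adam choice are assumptions, not from the linked post): either chain both parameter iterators into one optimizer, or use parameter groups for per-network settings.

```python
import itertools
import torch
import torch.nn as nn

# Hypothetical placeholders for the two separately kept networks.
network_1 = nn.Linear(8, 4)
network_2 = nn.Linear(4, 2)

# Option 1: one optimizer over both parameter sets.
optimizer = torch.optim.Adam(
    itertools.chain(network_1.parameters(), network_2.parameters()),
    lr=1e-3,
)

# Option 2: parameter groups, e.g. a smaller lr for the first network
# while fine-tuning.
optimizer = torch.optim.Adam([
    {"params": network_1.parameters(), "lr": 1e-4},
    {"params": network_2.parameters(), "lr": 1e-3},
])
```

Parameter groups are handy for fine-tuning because each group can carry its own lr, weight_decay, etc., while a single optimizer.step() still updates everything.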


Thank you very much, this definitely helps.