Training independent models simultaneously

Is it possible to train multiple models simultaneously?
For instance, suppose my network class is Net.

net1 = Net()
net2 = Net()

Is it possible to train net1 and net2 simultaneously?

Thanks.

I assume you need to do this because you want to use different training data for the models? In that case, yes, that should be possible. I think you can wrap them with a wrapper module, something like:

import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel

class WrapperModule(nn.Module):
    def __init__(self):
        super(WrapperModule, self).__init__()
        self.net0 = Net()
        self.net1 = Net()

    def forward(self, inputs):
        # run each sub-model on its own slice of the input
        return [self.net0(inputs[0]), self.net1(inputs[1])]

net = WrapperModule()
opt = SomeOptimizer(net.parameters())  # any torch.optim optimizer
ddp = DistributedDataParallel(net)
out0, out1 = ddp(inputs)
# reduce the two outputs to a scalar before backward();
# loss_fn and targets here are placeholders
(loss_fn(out0, targets[0]) + loss_fn(out1, targets[1])).backward()
opt.step()
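Since net0 and net1 don't share any parameters, each loss term only produces gradients for its own sub-model, so a single optimizer over net.parameters() effectively updates both models independently in one step.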

Sorry, just realized you didn’t mention DistributedDataParallel in the question. Is this for distributed training? Could you please provide more context?

You can pass each of your models to a different GPU. See How to deploy different scripts on different GPUs?
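For example, a minimal sketch of that approach, assuming two GPUs are available; Net, loss_fn, and loader are placeholders, not part of the original question:

import torch

net1 = Net().to("cuda:0")
net2 = Net().to("cuda:1")
opt1 = torch.optim.SGD(net1.parameters(), lr=0.01)
opt2 = torch.optim.SGD(net2.parameters(), lr=0.01)

for (x1, y1), (x2, y2) in loader:  # loader yields one batch per model
    # each model computes its loss on its own device
    loss1 = loss_fn(net1(x1.to("cuda:0")), y1.to("cuda:0"))
    loss2 = loss_fn(net2(x2.to("cuda:1")), y2.to("cuda:1"))
    loss1.backward()
    loss2.backward()
    opt1.step()
    opt2.step()
    opt1.zero_grad()
    opt2.zero_grad()

Because CUDA kernels are launched asynchronously, the work for the two models can overlap on the two devices even though the Python loop issues it sequentially.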