One optimizer for generator and discriminator in GAN

Hi,
I've recently been learning how to implement GANs with PyTorch.

In the official PyTorch Lightning Basic GAN Tutorial, two optimizers are used to train the generator and the discriminator, as follows:

    opt_g = torch.optim.Adam(self.generator.parameters(), lr=lr, betas=(b1, b2))
    opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=lr, betas=(b1, b2))
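
For context, the tutorial returns these two optimizers from the LightningModule's `configure_optimizers` hook, roughly as in the sketch below (details may differ between tutorial versions, and the `self.hparams` names are assumptions):

    def configure_optimizers(self):
        lr, b1, b2 = self.hparams.lr, self.hparams.b1, self.hparams.b2
        opt_g = torch.optim.Adam(self.generator.parameters(), lr=lr, betas=(b1, b2))
        opt_d = torch.optim.Adam(self.discriminator.parameters(), lr=lr, betas=(b1, b2))
        # Lightning alternates between the two optimizers during training
        return [opt_g, opt_d], []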

What happens if only one Adam optimizer is used for both, as follows? (Note that `torch.optim.Adam` takes parameter groups as a single list of dicts, not as separate positional arguments.)

    opt_g_d = torch.optim.Adam(
        [{'params': self.generator.parameters()},
         {'params': self.discriminator.parameters()}],
        lr=lr, betas=(b1, b2))
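
One thing worth noting (a sketch, not the tutorial's code): `step()` on the combined optimizer updates every parameter whose `.grad` is populated, and the generator loss backpropagates through the discriminator, so D would also be updated by G's objective unless its gradients are cleared first. The loss function and tensors below are placeholders:

    # Generator phase with a single shared optimizer (sketch)
    opt_g_d.zero_grad(set_to_none=True)        # clear grads of both G and D
    fake = self.generator(z)                   # z: latent noise (placeholder)
    g_loss = adversarial_loss(self.discriminator(fake),
                              torch.ones(z.size(0), 1))  # placeholder loss
    g_loss.backward()                          # fills .grad for G *and* D

    # step() skips params whose grad is None, so clearing D's grads
    # here mimics the two-optimizer behaviour:
    for p in self.discriminator.parameters():
        p.grad = None
    opt_g_d.step()                             # effectively updates only G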


I have the same question. Hope someone will reply. Thanks.

Here’s my code.

    import torch
    import torch.nn as nn

    class Generator(nn.Module):
        def __init__(self):
            super().__init__()  # required before registering any submodules
            pass

        def forward(self, x):
            pass

    class Discriminator(nn.Module):
        def __init__(self):
            super().__init__()
            pass

        def forward(self, x):
            pass

    class GAN(nn.Module):
        def __init__(self):
            super().__init__()
            self.G = Generator()
            self.D = Discriminator()

        def forward(self, x):
            pass

    model = GAN()
    # model.parameters() yields the parameters of both G and D,
    # so this single Adam instance covers the whole GAN:
    optimizer = torch.optim.Adam(model.parameters())
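
For completeness, a single training iteration with this shared optimizer might look like the sketch below. `adversarial_loss`, `real`, and `z` are placeholders; detaching the fake batch in the discriminator phase keeps gradients out of G, and clearing D's gradients in the generator phase keeps G's objective from updating D (same trick as in the earlier note):

    # Discriminator phase (sketch): detach fake so no grads flow into G
    optimizer.zero_grad(set_to_none=True)
    fake = model.G(z)
    d_loss = (adversarial_loss(model.D(real), torch.ones(real.size(0), 1))
              + adversarial_loss(model.D(fake.detach()), torch.zeros(real.size(0), 1)))
    d_loss.backward()                 # only D's grads are populated
    optimizer.step()                  # so only D is updated

    # Generator phase (sketch)
    optimizer.zero_grad(set_to_none=True)
    g_loss = adversarial_loss(model.D(model.G(z)), torch.ones(z.size(0), 1))
    g_loss.backward()                 # populates grads for both G and D
    for p in model.D.parameters():
        p.grad = None                 # step() skips params with grad=None
    optimizer.step()                  # updates only G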