Hello all. I am training a GAN-based model for segmentation. Training this architecture has two phases: training the D network and training the G network. While training the G network, I do not want to update the parameters of the D network. How should I do this?
The loss of the G network looks like:
loss = loss_G + 0.5 * loss_D
# Train G
pred = netG(images)
loss_S = criterionS(pred, targets)
D_pred = netD(F.softmax(pred, dim=1))
loss_D = criterionD(D_pred, targets)
loss = loss_S + 0.5 * loss_D
optimizerS.zero_grad()
loss.backward()
optimizerS.step()
The first solution is:
for param in netD.parameters():
    param.requires_grad = False
The second solution is:
loss_D = criterionD(D_pred.detach(), targets)
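To make the difference between the two options concrete, here is a minimal sketch of what each one does to the gradients. The two `Linear` layers are toy stand-ins for the real `netG`/`netD`, not the actual architecture:

```python
import torch
import torch.nn as nn

# Toy stand-ins for netG / netD (hypothetical shapes, for illustration only)
netG = nn.Linear(4, 4)
netD = nn.Linear(4, 1)
x = torch.randn(2, 4)

# Option 1: freeze D's parameters. Gradients still flow *through* D back
# to G, but D's own weights receive no gradient.
for p in netD.parameters():
    p.requires_grad = False
netD(netG(x)).mean().backward()
print(netG.weight.grad is not None)  # True  -> G still gets a gradient
print(netD.weight.grad is None)      # True  -> D's weights are untouched

# Option 2: detach D's output before computing the loss. The whole loss_D
# term then becomes a constant with respect to *both* networks, so it
# contributes no gradient at all, not even to G.
for p in netD.parameters():
    p.requires_grad = True
loss_D = netD(netG(x)).detach().mean()
print(loss_D.requires_grad)          # False -> no gradient from this term
```

So the two snippets are not equivalent: freezing parameters keeps D in the backward graph while leaving its weights untouched, whereas detaching cuts the graph entirely at that point.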
Which is correct?