Confusion about the zero_grad operation

In the PyTorch examples, DCGAN (main.py) zeroes the model gradients (netD.zero_grad() and netG.zero_grad()) but not the optimizer's.

Whereas other examples, such as the VAE and super-resolution main.py files, zero the optimizer's gradients and not the model's.
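For reference, the two patterns look roughly like this (a minimal sketch with a toy nn.Linear standing in for the real networks; the actual examples are more involved):

```python
import torch
import torch.nn as nn
import torch.optim as optim

net = nn.Linear(10, 1)  # toy stand-in for netD / the VAE model
optimizer = optim.Adam(net.parameters(), lr=1e-3)
x = torch.randn(4, 10)

# DCGAN style: clear gradients through the module
net.zero_grad()
net(x).sum().backward()
optimizer.step()

# VAE / super-resolution style: clear them through the optimizer
optimizer.zero_grad()
net(x).sum().backward()
optimizer.step()
```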

Which one is correct?

Note: I don't have much experience with GANs.

Both ways can work…

Is it correct if I call zero_grad() on both the model and the optimizer?

Do model_grad and optimizer_grad hold the same value?

What do you mean by model_grad and optimizer_grad? Gradients are only defined on variables that require them, including model parameters.

There is no need to do both. optim.zero_grad() simply sets the .grad attribute of every parameter it was given to zero.
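A quick way to check this (a minimal sketch; since the optimizer below is built from model.parameters(), model.zero_grad() and optimizer.zero_grad() clear exactly the same .grad attributes):

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(3, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Populate the gradients with something non-zero.
model(torch.randn(2, 3)).sum().backward()
print(model.weight.grad)  # non-zero tensor

# Either call clears the same tensors, so doing both is redundant.
optimizer.zero_grad()     # equivalent here to model.zero_grad()
print(model.weight.grad)  # zeros (None on recent PyTorch versions,
                          # where set_to_none=True is the default)
```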