I noticed that in the DCGAN implementation, bias is set to False on the conv layers. Is this necessary for GANs, and why?
I wondered this initially too. I think it’s because the beta term in batch norm effectively adds a learnable bias to each channel, so a separate conv bias would be redundant.
Actually, it also seems reasonable to me not to use the bias with ReLU and similar activations. Maybe others can elaborate mathematically.
@alexbellgrande is correct. BN’s mean subtraction cancels any constant bias added by the preceding conv, and BN (with affine=True) then adds its own learnable per-channel bias (beta), so dropping the conv bias changes nothing and just saves parameters.
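Here’s a minimal sketch illustrating the cancellation (tensor shapes and values are arbitrary, just for demonstration): adding a constant per-channel bias before a BatchNorm layer in training mode produces exactly the same output, because the batch-mean subtraction removes it.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(8, 3, 16, 16)

conv = nn.Conv2d(3, 4, kernel_size=3, bias=False)
bn = nn.BatchNorm2d(4)  # affine=True by default: learnable gamma and beta

z = conv(x)
y = bn(z)

# Add an arbitrary constant per-channel bias before BN.
bias = torch.randn(1, 4, 1, 1)
y_biased = bn(z + bias)

# BN subtracts the per-channel batch mean, which absorbs the bias,
# so the two outputs match (up to float rounding).
print(torch.allclose(y, y_biased, atol=1e-5))
```

This is why a conv bias directly before BN is dead weight: its effect is erased in the forward pass, and BN’s beta already provides a trainable per-channel offset afterward.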