Error - PyTorch BiGAN

The link in my previous post explains this behavior in more detail.
The short version is:

  • loss_d and loss_ge were calculated earlier, before your posted code snippet
  • optimizer_d.step() updates the discriminator's parameters in-place
  • loss_ge.backward() then tries to compute the generator's gradients by backpropagating through the discriminator into the generator
  • since the discriminator's parameters were already changed by the step, autograd detects that tensors needed for the backward pass were modified in-place, and the backward pass fails
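The failing ordering can be reproduced with a minimal sketch. The two `nn.Linear` modules below are toy stand-ins for the actual BiGAN generator and discriminator, and the losses are placeholders; the point is only the order of `optimizer_d.step()` and `loss_ge.backward()`:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
generator = nn.Linear(4, 4)       # hypothetical stand-in for the generator/encoder
discriminator = nn.Linear(4, 1)   # hypothetical stand-in for the discriminator

optimizer_d = torch.optim.SGD(discriminator.parameters(), lr=0.1)

x = torch.randn(8, 4)
out_d = discriminator(generator(x))

# Both losses share the same forward graph through the discriminator
loss_d = out_d.mean()
loss_ge = -out_d.mean()

loss_d.backward(retain_graph=True)
optimizer_d.step()  # modifies the discriminator's weight in-place

raised = False
try:
    # Backprop through the discriminator needs its *old* weight,
    # which was just overwritten by the optimizer step
    loss_ge.backward()
except RuntimeError as e:
    raised = True
    print("RuntimeError:", e)
```

Running this prints the familiar "one of the variables needed for gradient computation has been modified by an inplace operation" error.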

To avoid these issues, you could e.g. update the discriminator first, then perform a fresh forward pass and loss calculation for the generator update, and compute the generator's gradients as the last step before calling optimizer_ge.step().
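That reordering can be sketched as follows. Again the modules and losses are hypothetical placeholders, not the original BiGAN code; note the `.detach()` in the discriminator step (so no gradients flow into the generator there) and the fresh forward pass before the generator update:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
generator = nn.Linear(4, 4)       # hypothetical stand-in for the generator/encoder
discriminator = nn.Linear(4, 1)   # hypothetical stand-in for the discriminator

optimizer_ge = torch.optim.SGD(generator.parameters(), lr=0.1)
optimizer_d = torch.optim.SGD(discriminator.parameters(), lr=0.1)

x = torch.randn(8, 4)

# 1) Discriminator update first
optimizer_d.zero_grad()
out_d = discriminator(generator(x).detach())  # detach: no grads into the generator here
loss_d = out_d.mean()            # placeholder loss
loss_d.backward()
optimizer_d.step()

# 2) Fresh forward pass through the *updated* discriminator for the generator update
optimizer_ge.zero_grad()
out_g = discriminator(generator(x))
loss_ge = -out_g.mean()          # placeholder loss
loss_ge.backward()               # graph was built after the discriminator step, so this is valid
optimizer_ge.step()
```

Because the generator's backward pass uses a graph created after the discriminator update, no stale tensors are involved and no error is raised.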