DCGAN - Trying to backward through the graph a second time

Hi Uygar!

As a general principle, don’t just try things. Analyze the actual cause
of your specific issue and use the “option” that actually addresses that
cause.

Please look at the comments that I've added in-line to your quoted code:

The key issue is that you are using discriminator_fake_out to
optimize both discriminator and generator (leading in this particular
attempt to the “backward a second time” error). If you try using
retain_graph = True without doing things just so, you are
likely to get an inplace-modification error.
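
To make this concrete, here is a minimal sketch of the failing pattern, using hypothetical single-Linear-layer stand-ins for your actual networks (the shapes and names other than discriminator_fake_out are just placeholders):

```python
import torch

# hypothetical toy stand-ins for the actual generator and discriminator
generator = torch.nn.Linear(4, 4)
discriminator = torch.nn.Linear(4, 1)

noise = torch.randn(8, 4)
fake = generator(noise)
discriminator_fake_out = discriminator(fake)

# discriminator update: backward() frees the graph through discriminator
# (and, because fake was not detached, through generator as well)
discriminator_fake_loss = discriminator_fake_out.mean()
discriminator_fake_loss.backward()

# generator update: reuses the same discriminator_fake_out, so autograd
# tries to walk the already-freed graph a second time
generator_loss = -discriminator_fake_out.mean()
generator_loss.backward()   # RuntimeError: Trying to backward through the graph a second time
```

Note that no optimizer step() is taken anywhere here; the error comes purely from backpropagating twice through a graph that the first backward() freed.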

Probably the simplest way to address this issue when training GANs is
to rebuild the discriminator's computation graph by calling
discriminator_fake_out = discriminator(...) twice, at added
computational cost: once for the discriminator_fake_loss.backward()
backpropagation and then again for the generator_loss.backward()
backpropagation.
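
Sketched out (with the same toy stand-ins as above), that scheme looks like the following. Detaching fake for the first pass is part of making it work: it keeps discriminator_fake_loss.backward() from also consuming (and freeing) the generator part of the graph:

```python
import torch

# hypothetical toy stand-ins for the actual networks and optimizers
generator = torch.nn.Linear(4, 4)
discriminator = torch.nn.Linear(4, 1)
generator_optimizer = torch.optim.SGD(generator.parameters(), lr=0.01)
discriminator_optimizer = torch.optim.SGD(discriminator.parameters(), lr=0.01)

noise = torch.randn(8, 4)
fake = generator(noise)

# first forward pass: detach fake so this graph stops at discriminator
discriminator_fake_out = discriminator(fake.detach())
discriminator_fake_loss = discriminator_fake_out.mean()
discriminator_optimizer.zero_grad()
discriminator_fake_loss.backward()   # frees only the discriminator part
discriminator_optimizer.step()

# second forward pass: rebuilds the discriminator graph from scratch,
# this time connected to generator through fake
discriminator_fake_out = discriminator(fake)
generator_loss = -discriminator_fake_out.mean()
generator_optimizer.zero_grad()
generator_loss.backward()            # fresh graph, so no error
generator_optimizer.step()
```

Because the generator's backward pass runs through a freshly built graph, it also sees discriminator's parameters as they are after discriminator_optimizer.step(), so the inplace-modification error discussed below doesn't arise either.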

Yes, as explained in the in-line comments I added to your code.

I don’t believe this. For the version of the code you posted, you will get the
“backward a second time” error after calling generator_loss.backward(),
regardless of whether discriminator_optimizer.step() was called or not.

Perhaps in a different version of your code you had an inplace-modification
error that removing discriminator_optimizer.step() appeared to fix.

If you have come across (or do come across) inplace-modification errors,
it will be because discriminator_optimizer.step() modifies discriminator's
parameters inplace.
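
Here is a sketch of how that plays out (same toy stand-ins): keep the graph alive with retain_graph = True, step the discriminator optimizer, and then try the generator's backward pass:

```python
import torch

generator = torch.nn.Linear(4, 4)
discriminator = torch.nn.Linear(4, 1)
discriminator_optimizer = torch.optim.SGD(discriminator.parameters(), lr=0.01)

fake = generator(torch.randn(8, 4))
discriminator_fake_out = discriminator(fake)

discriminator_fake_loss = discriminator_fake_out.mean()
discriminator_fake_loss.backward(retain_graph=True)
discriminator_optimizer.step()   # modifies discriminator.weight inplace

# backpropagating through discriminator needs the weight that was saved
# during the forward pass, but step() has since modified it inplace
generator_loss = -discriminator_fake_out.mean()
generator_loss.backward()   # RuntimeError: one of the variables needed for
                            # gradient computation has been modified by an
                            # inplace operation
```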

A discussion about fixing inplace-modification errors that includes a
toy-GAN example can be found in this post:

Best.

K. Frank
