RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation torch version 2.0.0

Hi HJ!

You can frequently “fix” inplace-modification errors with pytorch’s
allow_mutation_on_saved_tensors context manager (in torch.autograd.graph,
available as of pytorch 2.0). But this is usually just a work-around and often a cop-out.
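As a minimal sketch of that work-around (using a toy one-parameter “model” in place of your actual critic network), mutating a tensor that autograd saved for the backward pass is tolerated inside the context manager, because autograd clones the saved value on mutation:

```python
import torch

with torch.autograd.graph.allow_mutation_on_saved_tensors():
    w = torch.tensor([2.0], requires_grad=True)
    loss = (w ** 2).sum()            # autograd saves w, since d(w**2)/dw = 2 * w
    loss.backward(retain_graph=True)
    with torch.no_grad():
        w += 1.0                     # inplace modification of a saved tensor --
                                     # normally this would poison the retained graph
    loss.backward()                  # succeeds: autograd uses a clone of the
                                     # pre-mutation value of w
```

Note the extra memory cost: autograd keeps a copy of every saved tensor that gets mutated, which is part of why this is a cop-out rather than a fix.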

This post is a discussion of how to find and properly fix inplace-modification errors.

This is likely the cause of your problem: critic_optimizer.step() performs inplace
modifications of the critic’s parameters. But because you called .backward() with
retain_graph = True, those modified parameters are likely getting used again when
you backpropagate a second time through the old graph that you retained.
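This failure mode can be reproduced with a toy one-parameter “critic” (a hypothetical stand-in for your actual network), together with the proper fix, which is to rebuild the graph with a fresh forward pass after each parameter update instead of retaining and re-using the stale one:

```python
import torch

# Toy "critic": autograd saves w for backward, since d(w**2)/dw = 2 * w.
w = torch.tensor([2.0], requires_grad=True)

loss = (w ** 2).sum()
loss.backward(retain_graph=True)  # first backward succeeds

with torch.no_grad():
    w += 1.0                      # inplace update, as optimizer.step() would do

caught = False
try:
    loss.backward()               # second backward through the stale, retained graph
except RuntimeError:
    caught = True                 # "... modified by an inplace operation"

# Proper fix: recompute the forward pass with the updated parameters,
# so the second backward runs through a fresh graph.
w.grad = None
loss = (w ** 2).sum()             # fresh forward pass; w is now 3.0
loss.backward()                   # no retain_graph needed; w.grad == 2 * 3.0
```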

Best.

K. Frank