Run optimization only on backpropagation of specific loss functions without running into the error "Trying to backward through the graph a second time"

I am implementing the VAE/GAN architecture, which has several loss functions during training. The authors propose updating the individual modules according to specific loss functions, and in doing so they reuse the same loss for several backpropagations. If I do that, I always get the error "RuntimeError: Trying to backward through the graph a second time, but the buffers have already been freed". Even when I call zero_grad() between these backpropagations, the error occurs. Is there an elegant way to reuse the same loss function for different backpropagations?
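For reference, a minimal example that reproduces the error (standalone tensors for illustration, not my actual VAE/GAN code):

```python
import torch

x = torch.randn(4, 3, requires_grad=True)
loss = (x ** 2).sum()

loss.backward()  # first pass succeeds, then the graph buffers are freed
loss.backward()  # RuntimeError: Trying to backward through the graph a second time
```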

Is backward(retain_graph=True) what you are looking for?
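A minimal sketch of how that could look with two modules sharing one loss (the encoder/decoder modules, optimizers, and shapes below are made-up placeholders, not the actual VAE/GAN architecture):

```python
import torch
import torch.nn.functional as F

# Placeholder modules standing in for the VAE/GAN parts.
encoder = torch.nn.Linear(8, 4)
decoder = torch.nn.Linear(4, 8)
opt_enc = torch.optim.SGD(encoder.parameters(), lr=0.01)
opt_dec = torch.optim.SGD(decoder.parameters(), lr=0.01)

x = torch.randn(16, 8)
recon = decoder(encoder(x))
loss = F.mse_loss(recon, x)

# First backward pass: retain_graph=True keeps the graph buffers alive.
opt_enc.zero_grad()
loss.backward(retain_graph=True)
opt_enc.step()

# Second backward pass over the same graph now succeeds; the final
# call can omit retain_graph so the buffers are freed afterwards.
opt_dec.zero_grad()
loss.backward()
opt_dec.step()
```

Only the final backward call can omit retain_graph=True. Note that zero_grad() alone does not help here, because it only clears the accumulated .grad fields; it does not restore the freed graph buffers.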
