How to fix a memory issue when using the += operator

I’m testing the following code, which uses the += operator.

It accumulates the loss here:
https://github.com/lixx2938/CGIntrinsics/blob/master/models/networks.py#L790

Then it takes the summed final loss and updates the network with it:
https://github.com/lixx2938/CGIntrinsics/blob/master/models/networks.py#L835
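To make the pattern concrete, here is a minimal self-contained sketch of what I understand the code to be doing (the model, loop, and loss terms are toy stand-ins of my own, not the actual CGIntrinsics code):

```python
import torch
import torch.nn as nn

# Toy stand-ins; the real code sums several intrinsic-image loss terms.
model = nn.Linear(10, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
x, target = torch.randn(4, 10), torch.randn(4, 10)

total_loss = 0.0
pred = model(x)
for _ in range(3):  # stand-in for the per-term loop
    total_loss += nn.functional.mse_loss(pred, target)  # graph-attached sum

optimizer.zero_grad()
total_loss.backward()  # backward on the summed final loss
optimizer.step()
```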


I’ve also read this related thread:
https://discuss.pytorch.org/t/cuda-memory-continuously-increases-when-net-images-called-in-every-iteration
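My takeaway from that thread is that += on losses only becomes a problem when the graph-attached sum outlives the iteration, e.g. a running total kept for logging. A minimal sketch of that failure mode and its fix, again with toy names of my own:

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.MSELoss()

running_loss = 0.0
for step in range(100):  # toy training loop
    x, y = torch.randn(4, 10), torch.randn(4, 1)
    loss = criterion(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # running_loss += loss        # keeps autograd history from every iteration
    running_loss += loss.item()   # .item() detaches, so the graph can be freed
```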


Does the code above (from the CGIntrinsics project) avoid that GPU memory issue (i.e., a continuously growing autograd graph)? I ask because I keep running into `RuntimeError: CUDA error: out of memory`.