How to fix a GPU memory issue when I use the `+=` operator on losses

I’m testing the following code, which uses the `+=` operator to:

- accumulate the loss across iterations, and

- take the final summed loss and update the network with it.
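Concretely, the pattern I’m describing looks roughly like this. This is a toy sketch, not the actual CGIntrinsics code; the model, shapes, and loss function are made up for illustration:

```python
import torch

# Hypothetical stand-in model and optimizer
model = torch.nn.Linear(4, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.01)

total_loss = 0.0
for _ in range(8):
    x = torch.randn(2, 4)
    y = torch.randn(2, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)
    total_loss += loss  # tensor +=: keeps each iteration's graph alive

opt.zero_grad()
total_loss.backward()  # backprop once through the summed loss
opt.step()
```

Because `total_loss` is a tensor that still requires grad, every iteration’s computation graph has to stay in memory until `backward()` is called.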

I’ve read this related link

Does the code above (from the CGIntrinsics project) have a GPU memory problem, such as a continuously growing gradient graph? I keep running into `RuntimeError: CUDA error: out of memory`.
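My understanding is that if the sum is only needed for logging (not for `backward()`), accumulating a plain Python float instead of the tensor frees each iteration’s graph. A toy example of that contrast, again with made-up model and shapes:

```python
import torch

net = torch.nn.Linear(3, 1)  # hypothetical stand-in model

running = 0.0
for _ in range(4):
    out = net(torch.randn(5, 3)).pow(2).mean()
    # .item() converts to a Python float, so the graph built for
    # `out` can be freed at the end of each iteration
    running += out.item()
```

Is that the distinction that matters here, or is something else holding GPU memory?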