I’m testing the following code, which uses the += operator to accumulate losses.
It computes a final summed loss, and I update the network using that summed loss.
I’ve read this relevant link
Does the above code (from the CGIntrinsics project) have a GPU memory issue, such as a continuously growing gradient graph?
I ask because I keep running into:
RuntimeError: CUDA error: out of memory
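For context, here is a minimal sketch of the two accumulation patterns I mean (the model, data, and loss are hypothetical stand-ins, not the actual CGIntrinsics code). Summing loss terms with += inside one update and calling backward() once is the intended pattern; the memory leak I suspect happens when a running total is accumulated across iterations without `.item()`, since the running tensor then keeps every iteration's graph alive:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)        # hypothetical stand-in model
opt = torch.optim.SGD(model.parameters(), lr=0.1)

# Pattern 1: sum per-term losses with +=, then one backward().
# The combined graph is freed by backward(), so this is fine on its own.
total_loss = 0.0
for _ in range(3):
    x, y = torch.randn(8, 4), torch.randn(8, 1)
    total_loss = total_loss + F.mse_loss(model(x), y)

opt.zero_grad()
total_loss.backward()                # frees the accumulated graph here
opt.step()

# Pattern 2: bookkeeping across iterations. Using .item() converts the
# loss to a plain Python float, dropping the graph reference; writing
# `running += loss` instead would retain every iteration's graph and
# grow GPU memory over time.
running = 0.0
for _ in range(3):
    x, y = torch.randn(8, 4), torch.randn(8, 1)
    loss = F.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
    running += loss.item()           # float, no graph attached
```

If the CGIntrinsics code keeps a graph-attached running sum like the second pattern without `.item()` (or `.detach()`), that would match the out-of-memory symptom.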