CUDA memory continuously increases when net(images) called in every iteration

Thanks @smth, I think I get it now: since the loss is a Variable, holding on to it keeps the graph growing, attaching what I assume is an entire new graph to it every iteration… I think.
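To make sure I'm picturing the right pattern, here is a minimal sketch (the names `net`, `criterion`, and the data are just placeholders I made up). Accumulating the loss tensor itself keeps every iteration's graph alive, while accumulating the plain float does not:

```python
import torch
import torch.nn as nn

net = nn.Linear(10, 1).cuda()
criterion = nn.MSELoss()

running_loss = 0.0
for _ in range(100):
    images = torch.randn(32, 10, device="cuda")
    targets = torch.randn(32, 1, device="cuda")
    loss = criterion(net(images), targets)

    # BAD: running_loss = running_loss + loss
    # `running_loss` would now be part of the graph, so the graphs from
    # all previous iterations stay referenced and CUDA memory grows.

    # OK: .item() returns a plain Python float, so this iteration's
    # graph can be freed as soon as `loss` goes out of scope.
    running_loss += loss.item()
```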

I'm just asking for the sake of understanding, so let me put it differently: are you saying that this statement here will build two graphs that are identical to each other?

loss = someLossFunction(input1) + someLossFunction(input2)

Is my conclusion correct?
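If I understand correctly, one way to peek at this is to inspect `grad_fn` after such a statement. In the sketch below (`f` is a hypothetical stand-in for `someLossFunction`), the two forward calls record two structurally identical subgraphs that are joined by a single add node at the top:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for someLossFunction: the same module
# applied to two different inputs.
f = nn.Linear(4, 1)

input1 = torch.randn(1, 4)
input2 = torch.randn(1, 4)

loss = f(input1).sum() + f(input2).sum()

# Top of the graph is the addition...
print(loss.grad_fn)  # AddBackward0

# ...and its two parents are the two separate (identical-looking)
# subgraphs, one per forward call:
print([fn for fn, _ in loss.grad_fn.next_functions])
```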
