CUDA memory continuously increases when net(images) is called in every iteration

@ruotianluo Thanks - if I understand you correctly, are you saying that, since I am accumulating a loss in a loop like this:

loss = 0
for i in range(100):
    out = net(input)
    loss += someLossFunction(out)  # accumulating the loss Variable itself

and that, since loss here is a Torch Variable, accumulating it causes the computation graph to be built and retained over and over again? Have I understood you correctly?
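
If so, would the fix be to accumulate a plain Python number instead of the Variable, so each iteration's graph can be freed? A minimal sketch of what I have in mind (assuming the same net, input, and someLossFunction as above; loss.data[0] is how I'd pull the scalar out of a one-element Variable in this PyTorch version, newer releases would use loss.item() instead):

    total_loss = 0.0  # plain Python float, not a Variable
    for i in range(100):
        out = net(input)
        loss = someLossFunction(out)
        # extract the scalar value so the graph built for this forward
        # pass is no longer referenced and can be freed
        total_loss += loss.data[0]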