Loss remains the same for every epoch

@ptrblck I think this behaviour is also the same when I manually detach a tensor? (which makes sense)
For example, if I have a tensor a and I do something like:

a.cpu().detach().numpy()
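
To illustrate what I mean, here is a minimal sketch (the tensor names are just placeholders). If I understand correctly, detach() returns a new tensor and the original tensor stays attached to the graph:

```python
import torch

# Minimal sketch (tensor names are placeholders): detach() returns a
# *new* tensor; the original tensor and its computation graph are untouched.
a = torch.randn(3, requires_grad=True)
b = a * 2                        # b is part of the autograd graph

vals = b.cpu().detach().numpy()  # copy values out for inspection/logging
print(b.requires_grad)           # True -- b is still attached to the graph

b.sum().backward()               # backward still works through the graph
print(a.grad)                    # tensor([2., 2., 2.])
```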

I found the following question relevant, and it is exactly my use case: Get values from tensor without detaching?

However, that question is unanswered. @J_Johnson @ptrblck, could you please suggest a workaround to detaching so that my computation graph is not altered?
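
For what it's worth, the kind of workaround I have in mind is something like the sketch below (a hypothetical training step; model, criterion, and optimizer are only assumed here to make it self-contained). As far as I can tell, reading values via .item() or under torch.no_grad() does not modify the graph of the live tensors:

```python
import torch

# Hypothetical training step, only to illustrate the kind of workaround
# I am asking about (model, criterion, optimizer are assumed/illustrative).
model = torch.nn.Linear(4, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x, y = torch.randn(8, 4), torch.randn(8, 1)

loss = criterion(model(x), y)

# Reading values out does not alter the graph attached to `loss`:
print(loss.item())             # Python float, graph untouched

with torch.no_grad():          # inspect intermediate values
    preds = model(x)           # this forward pass builds no graph

loss.backward()                # still works: the original graph is intact
optimizer.step()
```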