In PyTorch 0.4, Variables are deprecated. Just to get more clarity: how can I make sure that, past a certain point, no new tensors are retained?
For example, to accumulate the loss I used to follow these steps:

```python
running_loss += loss       # incorrect in the Variable context: retains the graph
running_loss += loss.data  # what I used instead
```
Also, what is the significance of `.data` now that Variables are gone?
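To illustrate the question, here is a minimal sketch (the model and loss are just placeholders) of accumulating a loss over steps. Adding the loss tensor itself keeps its autograd graph alive, so memory grows; `.item()` extracts a plain Python float, so nothing is retained:

```python
import torch

# Hypothetical minimal setup for demonstration.
model = torch.nn.Linear(4, 1)
criterion = torch.nn.MSELoss()

running_loss = 0.0
for _ in range(3):
    x = torch.randn(8, 4)
    y = torch.randn(8, 1)
    loss = criterion(model(x), y)

    # running_loss += loss        # would retain the graph every iteration
    running_loss += loss.item()   # .item() returns a Python float, no graph kept

print(type(running_loss))
```

`loss.data` also avoids retaining the graph, but it returns a tensor that silently bypasses autograd tracking; for a scalar loss, `.item()` (or `.detach()`) is the safer choice in 0.4+.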