Variable in 0.4

In PyTorch 0.4, Variables are deprecated. Just to get more clarity: how can I make sure that at a certain point no new tensors are being retained?
For example, to collect the loss I used to follow these steps:
running_loss += loss – this is incorrect in the context of Variables
running_loss += loss.data
Also, what is the significance of .data now that Variables are gone?

Thanks

.data still has the same semantics, but it’s recommended to use .detach() instead.
Have a look at the Migration Guide (“What about .data?” section).
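A minimal sketch of the difference, using a toy tensor (the variable names are just for illustration): both .data and .detach() give you a tensor that shares storage with the original but is excluded from the autograd graph; the difference is that .detach() is tracked by autograd's version counter, so an in-place modification that would corrupt gradients raises an error instead of failing silently.

```python
import torch

x = torch.ones(2, requires_grad=True)
y = x * 2

# .detach() returns a tensor sharing y's storage, but with
# requires_grad=False and no connection to the graph.
d = y.detach()

print(y.requires_grad)  # True
print(d.requires_grad)  # False

# y.data would also share storage and drop grad tracking,
# but it bypasses the version-counter safety check, so
# in-place edits through .data can silently give wrong gradients.
```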

To collect your loss, the recommended way is to use loss.item() instead of loss.data[0], since indexing into a scalar (0-dim) tensor gives a warning for now and will possibly be an error in future versions.
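A minimal sketch of the recommended pattern, assuming a toy linear model and random data (the model, loss, and optimizer here are just placeholders): loss.item() returns a plain Python float, so accumulating it does not keep the computation graph alive between iterations.

```python
import torch

model = torch.nn.Linear(4, 1)
criterion = torch.nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

running_loss = 0.0
for _ in range(3):
    inputs = torch.randn(8, 4)
    targets = torch.randn(8, 1)

    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()

    # .item() converts the 0-dim loss tensor to a Python float,
    # so no autograd graph is retained across iterations.
    running_loss += loss.item()

print(running_loss / 3)
```

If you accumulated `loss` itself instead, the whole graph of every iteration would stay in memory until `running_loss` is freed.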