Output from the loss function

Hey,

I am trying to understand an example regarding the training of a simple, fully connected net.
The model is trained with the input “data”.

output = self.model(data)
loss = F.nll_loss(output, target)
if loss.cpu().data.numpy()[0] > 10000:

What is loss.data doing? Is it somehow relating the loss to the input data?

Thanks!

Hi Johannes!

I’m not sure specifically what you’re asking, but here are some
comments:

In loss.cpu().data, data is a deprecated property that was used
to unwrap a Tensor from a Variable. (I believe this was prior to
pytorch version 0.4.0.)

It’s just a coincidence that the character string “data” in
loss.cpu().data happens to match that in self.model(data).
The two are used in different contexts, so the same character string
is allowed to refer to two different completely unrelated things.
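
For reference, here is a minimal sketch of the modern replacements for .data
(the scalar value is made up, just for illustration):

import torch

loss = torch.tensor(3.14)     # pretend this is a scalar loss
value = loss.item()           # plain python float, replaces loss.data[0]
detached = loss.detach()      # tensor cut off from the autograd graph, replaces loss.data
print(value, detached)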

In general, the line:

if loss.cpu().data.numpy()[0] > 10000:

seems to be overkill. There is no need to retrieve the loss
value from the gpu or convert it to numpy to perform the > test.

In current versions of pytorch:

if loss > 10000:

suffices.
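
As a quick sanity check, here is a self-contained sketch (the logits and
targets are dummies, just for illustration):

import torch
import torch.nn.functional as F

output = torch.randn(4, 10, requires_grad=True).log_softmax(dim=1)   # fake log-probabilities
target = torch.randint(0, 10, (4,))                                  # fake class labels
loss = F.nll_loss(output, target)

if loss > 10000:     # a 0-dim tensor compares directly with a python number
    print('loss exploded:', loss.item())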

In old (pre-0.4.0?) versions you had to unwrap the Tensor from the
Variable and extract the plain python value from the Tensor.

Either:

if loss.data[0] > 10000:

or:

if (loss > 10000).data[0]:

works.

Best.

K. Frank


Thanks for this thorough answer! It’s exactly what I needed to understand.
Since this behavior comes from Variable: in newer versions of PyTorch, is it better to just do training and testing with plain tensors, or is there any benefit to using Variable?

Hi Johannes!

With newer versions of pytorch, don’t use Variables. They are
deprecated and don’t actually do anything. I think that
torch.autograd.Variable is just a stub that is there for backward
compatibility.

If you happen to be using an old version of pytorch, but don’t use
Variables, your code won’t silently fail – it will immediately complain
if you try to do any kind of autograd stuff on a raw tensor. So you
don’t need to use Variables “just to be safe.”
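
For what it’s worth, here is a minimal sketch of the modern idiom with plain
tensors (the toy model and shapes are just for illustration):

import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Linear(8, 2)              # toy model
data = torch.randn(4, 8)             # plain tensor, no Variable wrapper
target = torch.randint(0, 2, (4,))

output = model(data)
loss = F.cross_entropy(output, target)
loss.backward()                      # autograd works directly on tensors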

Best.

K. Frank
