Confused about loss.data[0]

(Karl Tum) #1


I saw this in some demo code:

            outputs = model(inputs)
            _, preds = torch.max(outputs.data, 1)
            loss = criterion(outputs, labels)
            # backward + optimize only if in training phase
            if phase == 'train':
                loss.backward()
                optimizer.step()
            # statistics
            running_loss += loss.data[0]

If we would like to extract the loss tensor from the loss Variable, why not just use loss.data?

What does loss.data[0] mean here?

(Thomas V) #2


There is:

  • loss, the Variable,
  • loss.data, the (presumably size 1) Tensor,
  • loss.data[0], the (Python) float at position 0 in the tensor.

As such, by just using loss.data you would not run into the “keeping track of everything” problem (which would happen if you used loss directly and something is not volatile, since the whole autograd graph would be kept alive), but you would be accumulating torch tensors instead of plain Python numbers.
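The three levels above can be sketched in a small example. Note that in PyTorch 0.4 and later, Variable was merged into Tensor, losses are 0-dimensional, and `loss.item()` is the replacement for the old `loss.data[0]`; the data and model here are hypothetical stand-ins:

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-in for model output and labels.
outputs = torch.randn(4, 3, requires_grad=True)
labels = torch.tensor([0, 1, 2, 0])

loss = criterion_output = F.cross_entropy(outputs, labels)

# loss            -> a Tensor carrying the autograd graph
# loss.detach()   -> a Tensor cut off from the graph (old loss.data)
# loss.item()     -> a plain Python float (old loss.data[0])
running_loss = 0.0
running_loss += loss.item()

print(type(running_loss))  # a Python float, not a Tensor
```

Accumulating the float keeps the statistics bookkeeping cheap: nothing references the graph, so it can be freed after `backward()`.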

Best regards