From where does the backward() method come in custom loss functions

Continuing the discussion from Custom loss functions: in that solution, @ptrblck demonstrated a custom loss function. However, the function my_loss returns a plain torch tensor. How does calling .backward() on this tensor work in PyTorch?

In my case, my loss function looks like this:

def my_loss(out, tar):
    loss = torch.sum(out * tar)
    return loss

This returned loss doesn't seem to define a method named backward(), so how can I call .backward() on the tensor returned by loss = my_loss(out, tar)?

backward is not only a method of e.g. autograd.Function, but also a tensor method, defined as torch.Tensor.backward, which is why you can call .backward() directly on any tensor.
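A minimal sketch illustrating this: my_loss returns an ordinary torch.Tensor, and because out was created with requires_grad=True, that tensor carries a grad_fn, so torch.Tensor.backward can be called on it directly. The input values here are made up for illustration.

```python
import torch

def my_loss(out, tar):
    # Same custom loss as above: elementwise product, then sum to a scalar.
    loss = torch.sum(out * tar)
    return loss

out = torch.ones(3, requires_grad=True)   # model output (tracked by autograd)
tar = torch.tensor([1.0, 2.0, 3.0])       # target (no gradient needed)

loss = my_loss(out, tar)   # a plain torch.Tensor with a grad_fn
loss.backward()            # calls torch.Tensor.backward

# d/d(out) of sum(out * tar) is just tar:
print(out.grad)            # tensor([1., 2., 3.])
```

So no special loss class is needed; any scalar tensor produced from tensors that require gradients supports .backward().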
