Creating a custom loss function

To summarize: as long as you can express all the operations of your loss function directly on a Variable (without ever accessing Variable.data), you don't need to define any backward method; autograd will derive the gradients for you (see the sketch after the lists below).

This works with all functions/modules from
torch
torch.nn
torch.nn.functional

But it does not work with functions from
torch.Tensor
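
For example, here is a minimal sketch of the first case: a small weighted MSE loss (the function name and the weighting scheme are just illustrative) built only from differentiable torch operations, so no backward method is needed:

```python
import torch
from torch.autograd import Variable

# Loss composed entirely of autograd-friendly torch operations;
# the backward pass is derived automatically.
def weighted_mse_loss(input, target, weight):
    return torch.sum(weight * (input - target) ** 2)

x = Variable(torch.randn(4, 5), requires_grad=True)
y = Variable(torch.randn(4, 5))
w = Variable(torch.rand(4, 5))

loss = weighted_mse_loss(x, y, w)
loss.backward()  # x.grad is populated by autograd, no custom backward
```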

If you are writing a function that requires access to Variable.data (for example, if you are using a function that exists only on torch.Tensor), then you need to extend autograd and compute the derivative by hand in the backward method.
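
Here is a sketch of that second case, using the torch.autograd.Function interface (the class name is illustrative; this is the static forward/backward form of the API, where forward operates on raw tensor data and the gradient is written by hand):

```python
import torch
from torch.autograd import Function, Variable

class MyL1Loss(Function):
    @staticmethod
    def forward(ctx, input, target):
        # forward sees raw tensor data, so autograd cannot trace it;
        # save what backward will need.
        diff = input - target
        ctx.save_for_backward(diff)
        return diff.abs().sum()

    @staticmethod
    def backward(ctx, grad_output):
        diff, = ctx.saved_tensors
        # d|x|/dx = sign(x); chain rule multiplies by grad_output
        grad_input = grad_output * diff.sign()
        return grad_input, None  # no gradient w.r.t. target

x = Variable(torch.randn(3), requires_grad=True)
y = Variable(torch.randn(3))

loss = MyL1Loss.apply(x, y)
loss.backward()  # uses the hand-written backward above
```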
