Any conditions on a custom loss in PyTorch

The documentation says that autograd will take care of backpropagation when we define a custom loss.

However, are there any conditions that we need to consider when defining a custom loss? For example, should the loss always be differentiable?
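For context, here is the kind of minimal custom loss I have in mind; it is just a sketch (the function name, shapes, and the MSE formula are only for illustration), built entirely from differentiable tensor operations so autograd can backpropagate through it:

```python
import torch

def custom_loss(pred, target):
    # Composed only of differentiable tensor ops (subtraction,
    # squaring, mean), so autograd can build the backward graph.
    return torch.mean((pred - target) ** 2)

pred = torch.randn(8, requires_grad=True)
target = torch.randn(8)

loss = custom_loss(pred, target)
loss.backward()   # autograd computes d(loss)/d(pred)
print(pred.grad)  # gradients are populated on the leaf tensor
```

This works as expected, but I am unsure what happens if the loss contains non-differentiable pieces, e.g. `torch.argmax` or hard thresholding.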
