Custom loss functions

No, Autograd won’t be able to track the numpy operations, so you would need to implement the backward pass manually via a custom autograd.Function as described here.
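A minimal sketch of what such a custom `autograd.Function` could look like — the numpy-based loss here (a mean squared error) is a made-up example, not the loss from this thread, and the manually derived gradient in `backward` has to match whatever numpy math you put in `forward`:

```python
import torch
import numpy as np

class NumpySquaredError(torch.autograd.Function):
    """Example loss whose forward pass uses numpy, so the backward
    pass must be implemented manually (autograd cannot track numpy ops)."""

    @staticmethod
    def forward(ctx, input, target):
        # Save tensors needed for the manual gradient computation.
        ctx.save_for_backward(input, target)
        # Leave the autograd graph: detach and move to numpy.
        x = input.detach().cpu().numpy()
        t = target.detach().cpu().numpy()
        loss = np.mean((x - t) ** 2)
        # Return the result as a tensor on the input's device/dtype.
        return input.new_tensor(loss)

    @staticmethod
    def backward(ctx, grad_output):
        input, target = ctx.saved_tensors
        # Hand-derived gradient of mean((input - target)^2) w.r.t. input.
        grad_input = 2.0 * (input - target) / input.numel()
        # One gradient per forward argument; target needs none.
        return grad_output * grad_input, None

x = torch.randn(4, requires_grad=True)
y = torch.randn(4)
loss = NumpySquaredError.apply(x, y)
loss.backward()  # x.grad is now populated via the manual backward
```

Since the gradient is written by hand, it's worth checking it against a pure-PyTorch version of the same loss (or `torch.autograd.gradcheck`) before trusting it in training.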

I don’t understand the second code snippet, as l_full as well as the other tensors are not used in the last HardTripletLoss example.
