Creating a custom loss function compatible with automatic .backward()

Hello,

I’m currently looking to re-implement the paper Colorful Image Colorization, which uses a custom loss function. I’m wondering how to go about creating a custom loss function in which I don’t have to specify the gradients myself (i.e., one compatible with autograd).

I read through this thread, and if I have understood it correctly, as long as I use PyTorch operations to define a loss function, autograd will be able to generate the backward steps automatically (which the optimizer can then use)?

Hi,

Yes, exactly: you just need to implement your function with PyTorch operations!
Then you can wrap it in an nn.Module or keep it as a regular Python function, whichever fits best in your code.
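To make this concrete, here is a minimal sketch of both options. The names `rebalanced_ce_loss` and `RebalancedCELoss` are made up for illustration, and the loss itself is a simplified take on the class-rebalanced cross-entropy from the paper, not the exact formulation:

```python
import torch
import torch.nn as nn

def rebalanced_ce_loss(logits, targets, class_weights):
    # Hypothetical simplified loss, loosely inspired by the paper.
    # logits: (N, Q) unnormalized scores over Q quantized color bins
    # targets: (N, Q) soft ground-truth distributions over the bins
    # class_weights: (Q,) per-bin rebalancing weights
    log_probs = torch.log_softmax(logits, dim=1)
    # Weight each sample by the rebalancing factor of its dominant bin.
    sample_weights = class_weights[targets.argmax(dim=1)]
    ce = -(targets * log_probs).sum(dim=1)
    return (sample_weights * ce).mean()

class RebalancedCELoss(nn.Module):
    # Optional nn.Module wrapper around the plain function.
    def __init__(self, class_weights):
        super().__init__()
        # Register as a buffer so the weights move with .to(device).
        self.register_buffer("class_weights", class_weights)

    def forward(self, logits, targets):
        return rebalanced_ce_loss(logits, targets, self.class_weights)
```

Since only PyTorch operations are used, autograd tracks everything and `.backward()` just works:

```python
Q = 313  # e.g., the number of quantized ab bins in the paper
criterion = RebalancedCELoss(torch.ones(Q))
logits = torch.randn(8, Q, requires_grad=True)
targets = torch.softmax(torch.randn(8, Q), dim=1)
loss = criterion(logits, targets)
loss.backward()  # autograd builds the backward pass automatically
```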