Custom loss function: if it only includes a forward function, will the network do backpropagation automatically? Same question for a custom activation function

When defining my own loss function, I noticed that the customized loss function only includes a forward function. Will the network do backpropagation automatically? The same question applies to a custom activation function. Any advice would be appreciated, thanks.
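
For example, a simplified sketch of the kind of loss I mean (hypothetical, just to illustrate the pattern):

```python
import torch
import torch.nn as nn

# Hypothetical custom loss: only forward() is defined, no backward().
class MyMSELoss(nn.Module):
    def forward(self, pred, target):
        # Built entirely from differentiable PyTorch ops.
        return torch.mean((pred - target) ** 2)
```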

Yes.
This is the beauty of automatic differentiation: as long as you use PyTorch operators and don't break the computation graph between the previous layer and the next layer, the network will perform backprop automatically.
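
As a quick sanity check (a minimal sketch, assuming an MSE-style loss like the one above):

```python
import torch

pred = torch.randn(4, 3, requires_grad=True)
target = torch.randn(4, 3)

# Forward pass built only from PyTorch ops: autograd records every step.
loss = torch.mean((pred - target) ** 2)
loss.backward()          # autograd derives the backward pass automatically
print(pred.grad.shape)   # torch.Size([4, 3]): gradients reached the input
```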

Thanks for your reply, it is very helpful. I am a newcomer, so how can I tell whether my operations break the computation graph or not?