No grad_fn attribute for custom loss term

I've implemented a new loss function that measures the circularity of a segmentation based on image moments. I've combined this with the Dice loss, so my new loss is:
L = dice_loss + log10(circ_loss)/3 + 1

When printing the variables L, dice_loss and circ_loss above, I noticed that the circ_loss term had no grad_fn attribute.

I have very little knowledge of what this means, but I am worried that it means backpropagation is not taking the circ_loss term into account. There is no error during training and the overall loss seems to be decreasing, but I'm wondering what the lack of a grad_fn attribute means.
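For reference, here is a minimal reproduction of the symptom with placeholder names (pred and moments stand in for my actual tensors, not my real loss code): a term built from operations on a tracked tensor gets a grad_fn, while a term built from untracked data does not.

```python
import numpy as np
import torch

# Term derived from a tensor that autograd is tracking: has a grad_fn.
pred = torch.rand(4, requires_grad=True)
dice_like = 1 - pred.mean()
print(dice_like.grad_fn)   # <RsubBackward1 object at ...>

# Term derived from plain data (e.g. a NumPy round trip): no grad_fn,
# so backward() silently ignores it.
moments = torch.from_numpy(np.array([0.7]))
circ_like = torch.log10(moments) / 3
print(circ_like.grad_fn)   # None

L = dice_like + circ_like + 1
print(L.grad_fn)           # <AddBackward0 ...> -- but no gradient reaches moments
```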

I can provide any code that would be helpful.

EDIT:
Adding requires_grad=True during initialisation of the leaf tensors brought back the grad_fn attribute, so the code no longer has the issue described in the question. Not sure how to close the question.
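For anyone who hits the same thing, a minimal sketch of the fix (moments is a placeholder for my actual leaf tensor):

```python
import torch

# Leaf tensor created with gradient tracking enabled from the start.
moments = torch.tensor([0.7, 0.2], requires_grad=True)

circ_loss = torch.log10(moments.sum()) / 3
print(circ_loss.grad_fn)   # <DivBackward0 ...> -- now part of the graph

circ_loss.backward()
print(moments.grad)        # gradients flow back to the leaf
```

One caveat: if the leaf tensor was itself derived from the network output through non-differentiable steps (e.g. a NumPy round trip), enabling gradients on it only builds the graph from that point onward; the model parameters upstream still receive no gradient from this term.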

Could you post the entire loss function code?

The issue seems to come from how you define circ_loss, since it has no grad_fn attribute (whereas dice_loss and L do, as shown by the grad_fn=<RsubBackward> and grad_fn=<AddBackward0> terms respectively). If you're creating the data yourself (or importing it from a NumPy array), make sure to tell PyTorch that it requires a gradient via the requires_grad_() method.
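Something along these lines (arr is just an illustrative array, not your data):

```python
import numpy as np
import torch

arr = np.array([1.0, 2.0, 3.0])
t = torch.from_numpy(arr)     # t.requires_grad is False here

t.requires_grad_()            # in-place: operations on t are now recorded
loss = (t ** 2).sum()
print(loss.grad_fn)           # <SumBackward0 object at ...>
```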

Could you provide the code that computes circ_loss, starting from whatever inputs it is based on?