Element 0 of tensors does not require grad and does not have a grad_fn

Hello all,
I am experimenting with a custom IoU loss and trying to backpropagate through the sum of IoUs. The implementation is as follows:


import torch

def sumIoU(anchors, gt):
    """Sum of IoU over every (anchor, gt) pair.

    anchors, gt: (batch, num_boxes, 4) tensors in (x1, y1, x2, y2) corner format.
    """
    num_anchors = anchors.shape[1]
    num_gts = gt.shape[1]
    net_IoU = 0
    for i in range(num_anchors):
        anchor = anchors[:, i, :]
        for j in range(num_gts):
            j_gt = gt[:, j, :]
            # Intersection rectangle: top-left is the max of the two top-left
            # corners, bottom-right is the min of the two bottom-right corners.
            xi = torch.max(anchor[:, 0], j_gt[:, 0])
            yi = torch.max(anchor[:, 1], j_gt[:, 1])
            wi = torch.clamp(torch.min(anchor[:, 2], j_gt[:, 2]) - xi, min=0)
            hi = torch.clamp(torch.min(anchor[:, 3], j_gt[:, 3]) - yi, min=0)
            area_i = wi * hi
            # Union = area(anchor) + area(gt) - intersection.
            area_u = ((anchor[:, 2] - anchor[:, 0]) * (anchor[:, 3] - anchor[:, 1])) + \
                     ((j_gt[:, 2] - j_gt[:, 0]) * (j_gt[:, 3] - j_gt[:, 1])) - area_i

            # Clamps keep the ratio numerically safe.
            IoU = torch.clamp(area_i, min=1e-5) / torch.clamp(area_u, min=1e-7)
            net_IoU += IoU
    return net_IoU
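
As a sanity check of the geometry (toy boxes I made up, not my real data): the corner-format boxes (0, 0, 2, 2) and (1, 1, 3, 3) overlap in a 1×1 square, so the IoU should be 1 / (4 + 4 - 1) = 1/7 ≈ 0.1429, and the function agrees:

# Toy 1-anchor / 1-gt case, batch size 1, (x1, y1, x2, y2) corner format.
anchors = torch.tensor([[[0., 0., 2., 2.]]])  # shape (1, 1, 4)
gt = torch.tensor([[[1., 1., 3., 3.]]])       # shape (1, 1, 4)
print(sumIoU(anchors=anchors, gt=gt))         # tensor([0.1429]) ~= 1/7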

I then call backward on it:

IoU = sumIoU(anchors=anchors, gt=gt)
IoU.backward()

I run into the following error:

RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
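
I can reproduce the exact same error with a minimal, self-contained version that swaps my real data for made-up random boxes (the (batch, num_boxes, 4) shapes are just my assumption here; note that nothing has requires_grad set):

anchors = torch.rand(1, 8, 4)  # made-up stand-in for my real anchors
gt = torch.rand(1, 3, 4)       # made-up stand-in for my real ground truths

IoU = sumIoU(anchors=anchors, gt=gt)
print(IoU.requires_grad, IoU.grad_fn)  # False None
IoU.backward()  # RuntimeError: element 0 of tensors does not require grad ...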

The returned value, i.e. IoU, is tensor([[466.2932]], device='cuda:0').
I had read somewhere that as long as we stick to torch functions we should be fine, but I don't know what is going wrong, since I don't fully understand the error. Could someone please tell me what I should do to get rid of it?
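
One thing I did try: with the same toy boxes as above, the chain of torch ops does produce a grad_fn once the input tensors themselves require grad, so I wonder if the problem is simply how my anchors and gt are created upstream:

anchors = torch.tensor([[[0., 0., 2., 2.]]], requires_grad=True)
gt = torch.tensor([[[1., 1., 3., 3.]]], requires_grad=True)

out = sumIoU(anchors=anchors, gt=gt)
print(out.grad_fn)   # e.g. <AddBackward0 object at 0x...>
out.backward()       # runs without error
print(anchors.grad)  # gradients flow back to the boxes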
Sorry for the noob question…
TIA