About dice loss

import torch
import torch.nn as nn


class BinaryDiceLoss(nn.Module):
    def __init__(self):
        super(BinaryDiceLoss, self).__init__()

    def forward(self, input, target):  # input.size() == target.size() == (N, 1, 128, 128, 128)
        N = target.size(0)  # N is the batch size
        smooth = 1

        # Flatten each sample to a 1D vector
        input_flat = input.view(N, -1)
        target_flat = target.view(N, -1)

        intersection = input_flat * target_flat

        # Per-sample dice coefficient, then average and convert to a loss
        loss = 2 * (intersection.sum(1) + smooth) / (input_flat.sum(1) + target_flat.sum(1) + smooth)
        loss = 1 - loss.sum() / N

        return loss

Hello, I am training a semantic segmentation model and use the BinaryDiceLoss above to backpropagate through the model parameters, but the loss does not decrease. The data fed to the model is correct, because the loss does decrease when I use a different loss function. How should I deal with this problem? Is the BinaryDiceLoss above correct? Thank you.
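
For reference, here is roughly how I call the loss during training. This is only a minimal sketch with dummy tensors, not my real model: the tensor shapes are scaled down for illustration, and the sigmoid on the network output reflects my assumption that the loss expects probabilities in [0, 1] rather than raw logits.

    # Minimal usage sketch with dummy data (shapes reduced from (N, 1, 128, 128, 128)
    # for readability). The sigmoid is applied so the predictions lie in [0, 1]
    # before they enter the dice loss.
    criterion = BinaryDiceLoss()

    logits = torch.randn(2, 1, 16, 16, 16, requires_grad=True)  # stand-in for the network output
    target = torch.randint(0, 2, (2, 1, 16, 16, 16)).float()    # binary ground-truth mask

    loss = criterion(torch.sigmoid(logits), target)
    loss.backward()
    print(loss.item())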