Add custom regularizer to MSELoss

Hi,

I’m trying to implement a custom regularizer, but my final loss diverges (it blows up roughly exponentially). My custom loss (and regularizer) function is:

import torch

jacobian_similarity = [[1, -1, 0], [1, 0, -1], [0, 1, -1]]  # all pairwise differences, for 3 classes

def customized_loss(pred, target, x):
    def mean_square_loss(pred, target):
        return ((pred - target) ** 2).sum() / pred.data.nelement()

    def regularizer(pred, x):
        r = 0
        for x_ in range(pred.size(0)):  # loop over the batch
            for js in jacobian_similarity:
                # vector-Jacobian product: leaves the difference of two
                # Jacobian rows for sample x_ in x.grad
                pred[x_].backward(torch.FloatTensor(js), create_graph=True)
                r += torch.norm(x.grad[x_], 2)
                x.grad.data.zero_()
        return r

    loss = mean_square_loss(pred, target)
    regu = regularizer(pred, x)

    return loss + 0.01 * regu
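
For reference, this is roughly how I call it in a training step (model, optimizer, batch_size and n_features here are placeholders for my actual setup):

x = torch.randn(batch_size, n_features, requires_grad=True)  # x must track gradients so x.grad exists
pred = model(x)                          # shape (batch_size, 3) for 3 classes
loss = customized_loss(pred, target, x)

optimizer.zero_grad()
loss.backward()                          # backprops through the MSE term and the regularizer
optimizer.step()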

jacobian_similarity is a list that holds all possible ‘difference’ vectors, so when .backward() is called with one of them it computes the difference of two Jacobian rows, e.g.
with [1, -1, 0] I get jacobian_f1 - jacobian_f2.
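
To check the trick in isolation, here is a tiny self-contained example (toy linear model, made-up numbers): for y = Wx the Jacobian dy/dx is exactly W, so calling backward with [1, -1, 0] should leave W[0] - W[1] in x.grad:

import torch

W = torch.tensor([[1., 2.], [3., 4.], [5., 6.]])  # Jacobian of y = W x
x = torch.tensor([1., 1.], requires_grad=True)
y = W @ x                                  # 3 outputs, like 3 classes

y.backward(torch.tensor([1., -1., 0.]))    # vector-Jacobian product
print(x.grad)                              # tensor([-2., -2.]) == W[0] - W[1]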

My regularizer steps are:

  1. compute the Jacobian of each output w.r.t. the input
  2. for each row of the Jacobian (f1, …, fi, …, fn), compute the 2-norm of its difference with every other row (like norm2(f1 - f2) + norm2(f1 - f3) + … + norm2(f1 - fn) + norm2(f2 - f1) + …)
  3. add the regularizer to the loss (see the sketch after this list)
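
For comparison, here is a minimal sketch of those three steps written with torch.autograd.grad instead of .backward() plus x.grad, so nothing is zeroed in place (reducing to one 2-norm per sample and summing over the batch is my assumption here):

def regularizer_v2(pred, x):
    r = 0
    for js in jacobian_similarity:
        # steps 1-2: one vector-Jacobian product per difference vector,
        # applied to every sample in the batch at once
        v = torch.FloatTensor(js).expand_as(pred)
        grads, = torch.autograd.grad(pred, x, grad_outputs=v, create_graph=True)
        # one 2-norm per sample, summed over the batch
        r = r + grads.view(grads.size(0), -1).norm(2, dim=1).sum()
    return r  # step 3: add 0.01 * regularizer_v2(pred, x) to the MSE loss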

Maybe my implementation of the regularizer disconnects the graph somewhere?