Lipschitz constant as regularization term

Hello everyone!
I want to add the Lipschitz constant of a neural network, which I compute after each weight update, as a regularization term. If I just modify the loss in the training function as `loss += Lipschitz_constant`, I think this won't affect the training, since I have only added a plain number to the loss.
How can I add the Lipschitz constant (which is, of course, a function of the weights W) to the loss function as a function of the weights rather than just as a number, so that the optimizer does not ignore the Lipschitz constant when optimizing?
Thanks a lot.

Hi,

You will have to compute the Lipschitz constant in a differentiable manner, or provide the formula for the gradient yourself via a custom autograd.Function.
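
For example, for a plain feed-forward network whose activations are themselves 1-Lipschitz (ReLU, tanh), the product of the spectral norms of the weight matrices is a standard upper bound on the Lipschitz constant, and it can be computed entirely with differentiable torch ops. A minimal sketch, assuming an `nn.Linear`-based model (the function name is mine):

```python
import torch
import torch.nn as nn

def lipschitz_upper_bound(model: nn.Module) -> torch.Tensor:
    """Differentiable upper bound on the Lipschitz constant:
    the product of the spectral norms of all Linear weights
    (valid when every activation is itself 1-Lipschitz)."""
    bound = torch.ones((), device=next(model.parameters()).device)
    for module in model.modules():
        if isinstance(module, nn.Linear):
            # torch.linalg.matrix_norm(W, ord=2) is the largest singular
            # value; it is built from torch ops, so autograd reaches W.
            bound = bound * torch.linalg.matrix_norm(module.weight, ord=2)
    return bound
```

If your Lipschitz estimate itself is not differentiable, the alternative is a custom autograd.Function where you supply the gradient by hand. A toy sketch using the Frobenius norm as a stand-in penalty (the real backward formula depends on your estimator):

```python
class LipschitzPenalty(torch.autograd.Function):
    @staticmethod
    def forward(ctx, weight):
        ctx.save_for_backward(weight)
        return weight.norm()  # stand-in: Frobenius norm of the weight

    @staticmethod
    def backward(ctx, grad_output):
        (weight,) = ctx.saved_tensors
        # hand-written gradient: d||W||_F / dW = W / ||W||_F
        return grad_output * weight / weight.norm().clamp_min(1e-12)

# usage: loss = loss + lam * LipschitzPenalty.apply(layer.weight)
```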

What do you mean by a differentiable manner?

I mean that you use only Tensors and PyTorch ops for which we can compute gradients, so that automatic differentiation will be able to compute the gradient of the Lipschitz constant with respect to the parameters.
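
Concretely, the difference between the broken and the working version of the `loss += Lipschitz_constant` idea from the original question could look like this (`criterion`, `out`, `y`, `lam`, and the `lipschitz_upper_bound` sketch above are assumed names):

```python
# Breaks the graph: .item() turns the bound into a plain Python float,
# so it is just a constant added to the loss and the optimizer ignores it.
loss = criterion(out, y) + lam * lipschitz_upper_bound(model).item()

# Keeps the graph: the bound stays a Tensor built from the weights,
# so loss.backward() also produces gradients for the penalty term.
loss = criterion(out, y) + lam * lipschitz_upper_bound(model)
```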