I am looking for a way to add regularization to the model weights when optimizing with the LBFGS optimizer. My validation error shows that I am overfitting after ~20 epochs.

Is there a way to introduce regularization in PyTorch, other than early stopping?

Not an expert here. According to the documentation, `LBFGS.step` takes a closure that recomputes and returns the loss. Maybe you can apply the L2 regularization directly inside that closure? Compute the sum of the squared L2 norms of all the parameters and add it to the loss:

```
l2_norm = torch.tensor(0.)
for p in model.parameters():
    l2_norm += (p ** 2).sum()  # something similar, you get the point
```
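To make that concrete, here is a minimal runnable sketch of L2 regularization inside an LBFGS closure. The model, data, and `lambda_l2` below are placeholders for illustration, not your actual setup:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Linear(10, 1)                      # toy model, stands in for yours
x, y = torch.randn(32, 10), torch.randn(32, 1)
lambda_l2 = 1e-3                              # regularization strength (tune this)

optimizer = torch.optim.LBFGS(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

def closure():
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    # Add the squared L2 norm of all parameters to the loss
    l2_norm = sum((p ** 2).sum() for p in model.parameters())
    loss = loss + lambda_l2 * l2_norm
    loss.backward()
    return loss

for _ in range(5):
    loss = optimizer.step(closure)
```

Since the penalty is part of the loss returned by the closure, LBFGS sees the regularized objective in every internal line-search evaluation as well.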
