Dear all,

I'm trying to use Lasso regression (L1 regularization) as part of a loss function. Does anyone have an idea how to do this?

thanks

Suppose you have an autoencoder (but this approach is valid for any other network).

You could write a function that takes the current weight values and computes the L1 regularization term (i.e., the Lasso penalty).

For instance:

```
def L1_reg(self):
    # Flatten all encoder parameters into a single vector
    weights = torch.cat([x.view(-1) for x in self.encoder.parameters()])
    # L1 penalty: lambda * ||w||_1
    return LAMBDA_L1 * torch.norm(weights, 1)
```

where `LAMBDA_L1` is the lambda parameter of the Lasso (remember that this is a hyperparameter). With this little function you only compute the L1 penalty on the encoder's weights, but feel free to include the decoder's weights too if you want!
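To show how this fits into training, here is a minimal, self-contained sketch: the autoencoder architecture, the `LAMBDA_L1` value, and the layer sizes are all hypothetical placeholders, and the key step is simply adding `L1_reg()` to the reconstruction loss before calling `backward()`.

```
import torch
import torch.nn as nn

LAMBDA_L1 = 1e-4  # hypothetical value; tune it as a hyperparameter


class AutoEncoder(nn.Module):
    """Toy autoencoder, just for illustration."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(8, 2)
        self.decoder = nn.Linear(2, 8)

    def forward(self, x):
        return self.decoder(self.encoder(x))

    def L1_reg(self):
        # Flatten all encoder parameters and take the L1 norm
        weights = torch.cat([p.view(-1) for p in self.encoder.parameters()])
        return LAMBDA_L1 * torch.norm(weights, 1)


model = AutoEncoder()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(16, 8)        # a fake batch of inputs
recon = model(x)
# Add the L1 penalty to the reconstruction loss before backprop
loss = criterion(recon, x) + model.L1_reg()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the penalty is part of the scalar loss, autograd pushes its gradient into the encoder weights automatically; no extra bookkeeping is needed.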

If you want Ridge (L2) regularization instead, use `2` instead of `1` in `torch.norm`.

Check the `torch.linalg.norm` documentation to better understand it.
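As a quick sanity check on what the two norms compute, you can compare them against hand-calculated values on a tiny vector (the numbers here are arbitrary):

```
import torch

w = torch.tensor([3.0, -4.0])

l1 = torch.norm(w, 1)  # |3| + |-4| = 7
l2 = torch.norm(w, 2)  # sqrt(3^2 + 4^2) = 5

print(l1.item())  # 7.0
print(l2.item())  # 5.0
```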

Hope I've been helpful!