I'm trying to use Lasso regression as part of my loss function. Does anyone have an idea how to do it?
Suppose you have an autoencoder (but this approach is valid for any other network).
You could write a function that takes the current weight values and computes the L1 regularization term (i.e., the Lasso penalty).
```python
def L1_reg(self):
    # flatten all encoder parameters into a single vector
    weights = torch.cat([x.view(-1) for x in self.encoder.parameters()])
    # L1 penalty: lambda times the L1 norm of the weights
    L1_reg = LAMBDA_L1 * torch.norm(weights, 1)
    return L1_reg
```
LAMBDA_L1 is the lambda parameter of the Lasso (remember that this is a hyperparameter). With this little function you only compute the L1 penalty on the encoder's weights, but feel free to include the decoder's weights as well if you want to!
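To show how this fits together, here is a minimal, self-contained sketch of adding the penalty to the reconstruction loss in a training step. The architecture, layer sizes, and LAMBDA_L1 value are just placeholders for illustration:

```python
import torch
import torch.nn as nn

LAMBDA_L1 = 1e-4  # hyperparameter: strength of the L1 penalty (placeholder value)

class Autoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        # toy encoder/decoder; replace with your own architecture
        self.encoder = nn.Sequential(nn.Linear(784, 32), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(32, 784), nn.Sigmoid())

    def forward(self, x):
        return self.decoder(self.encoder(x))

    def L1_reg(self):
        # flatten all encoder parameters into one vector, take its L1 norm
        weights = torch.cat([p.view(-1) for p in self.encoder.parameters()])
        return LAMBDA_L1 * torch.norm(weights, 1)

model = Autoencoder()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters())

x = torch.rand(16, 784)  # dummy input batch
# total loss = reconstruction loss + L1 penalty on the encoder weights
loss = criterion(model(x), x) + model.L1_reg()
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the penalty is part of the loss tensor, autograd differentiates through it automatically and the optimizer step shrinks the encoder weights toward zero.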
If you want to use Ridge (L2) regularization instead, use torch.norm(weights, 2) instead of torch.norm(weights, 1).
Check the documentation of the torch.linalg.norm method to better understand how the norm is computed.
Hope I’ve been helpful!