How can I add different regularization to different layers?

The default regularization doesn't suit me. How can I add a different regularization to each layer type?

You could add your own regularization term for each layer. This post shows a similar example using the L1 norm, applied to each parameter.
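For context, a minimal sketch of what that linked approach looks like, assuming it adds the L1 norm of every parameter to the task loss with one common factor (model, criterion, and l1_lambda are hypothetical placeholders):

import torch.nn as nn

model = nn.Linear(10, 2)  # any model
criterion = nn.CrossEntropyLoss()
l1_lambda = 1e-4  # one common factor for all parameters

def loss_with_l1(output, target):
    # Sum the L1 norms of all parameters into a single term
    l1 = sum(p.abs().sum() for p in model.parameters())
    return criterion(output, target) + l1_lambda * l1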

I have a similar need: different regularization factors for different layers.

The post you referred to calculates the regularization loss using the L1 norm of each parameter, but it adds the whole regularization_loss to the final loss. During backpropagation, this loss is therefore common to all parameters rather than different for different parameters. In my case, how do I apply a different regularization factor to different layers?


You could adapt the for loop and add the desired regularization to the corresponding layer.
In the linked example, the code just iterates over all parameters, so you could add some if conditions etc., depending on your use case; a sketch of this idea follows below.
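A minimal sketch of that idea, assuming per-layer factors keyed on parameter-name prefixes (the model and the factors dict are hypothetical; the prefixes "0." and "2." match the two Linear layers that model.named_parameters() yields here):

import torch.nn as nn

model = nn.Sequential(
    nn.Linear(10, 20),
    nn.ReLU(),
    nn.Linear(20, 1),
)

# Hypothetical per-layer regularization factors
factors = {"0.": 1e-4, "2.": 1e-2}

def regularization_loss(model):
    reg = 0.0
    for name, param in model.named_parameters():
        for prefix, lam in factors.items():
            if name.startswith(prefix):
                # L1 norm of this parameter, scaled by its layer's factor
                reg = reg + lam * param.abs().sum()
    return reg

You would then add regularization_loss(model) to your task loss before calling backward().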


Thanks for that. But it is still tough for me to grasp the idea. Can you show me an example?

Maybe for this one:
I have an LSTM model with 3 layers:

lstm1 = nn.LSTM(150, 100, num_layers=1, batch_first=True)
lstm2 = nn.LSTM(100, 110, num_layers=1, batch_first=True)
lstm3 = nn.LSTM(110, 200, num_layers=1, batch_first=True)

During backpropagation, I want the L1 norm of the lstm1 weights to affect only lstm1, and similarly for the other layers. What should my loss function look like?
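A minimal sketch of what such a loss could look like, assuming L1 regularization of the weight matrices with hypothetical per-layer factors lambda1, lambda2, lambda3 and an assumed MSE task loss:

import torch.nn as nn

lstm1 = nn.LSTM(150, 100, num_layers=1, batch_first=True)
lstm2 = nn.LSTM(100, 110, num_layers=1, batch_first=True)
lstm3 = nn.LSTM(110, 200, num_layers=1, batch_first=True)

# Hypothetical per-layer regularization factors
lambda1, lambda2, lambda3 = 1e-4, 1e-3, 1e-2

def layer_l1(lstm):
    # L1 norm of one layer's weights (weight_ih_l0 and weight_hh_l0);
    # biases are skipped in this sketch
    return sum(p.abs().sum() for n, p in lstm.named_parameters() if "weight" in n)

criterion = nn.MSELoss()  # assumed task loss

def total_loss(output, target):
    loss = criterion(output, target)
    loss = loss + lambda1 * layer_l1(lstm1)
    loss = loss + lambda2 * layer_l1(lstm2)
    loss = loss + lambda3 * layer_l1(lstm3)
    return loss

Since each penalty term involves only one layer's parameters, backward() propagates each factor only into the gradients of the corresponding layer.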