Different learning rates for different modules of nn.Sequential

I have an nn.Sequential model (built from legacy nn modules) with 21 layers. I wish to use a learning rate of 1 for all layers except layers 5, 10, 15, and 20, which should instead use a learning rate of 0.1. I understand that I am supposed to group the parameters that share a learning rate together, but I am not sure about the best way to do it.

I have tried the following:

list1 = nn.ParameterList()
for i in range(0, 21):
    if i % 5 != 0:
        list1.append(model.get(i).parameters())

However, this fails with an error.
Can anyone suggest a quick example that achieves this?
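For reference, here is a minimal sketch of the usual approach with modern torch.nn and optimizer parameter groups (not the legacy nn API): collect the parameters into plain Python lists and pass one dict per learning rate to the optimizer. The 21-layer Linear model below is a hypothetical stand-in for the model described above, and the choice of indices 5, 10, 15, 20 follows the question text.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the 21-layer model described above.
model = nn.Sequential(*[nn.Linear(4, 4) for _ in range(21)])

fast_params, slow_params = [], []
for i, layer in enumerate(model):
    # Layers at indices 5, 10, 15, 20 get the smaller learning rate.
    target = slow_params if i in (5, 10, 15, 20) else fast_params
    target.extend(layer.parameters())

# One parameter group per learning rate.
optimizer = torch.optim.SGD([
    {"params": fast_params, "lr": 1.0},
    {"params": slow_params, "lr": 0.1},
])
```

Plain lists work here because the optimizer only needs an iterable of tensors per group; nn.ParameterList is meant for registering parameters on a module, not for building optimizer groups.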