How to stop weight-decay from affecting PReLU?

Hi, I would like to use the PReLU activation; however, as can be seen here, the documentation warns against using weight decay with it.

In my code, weight decay is already being used; the optimizer is created like this:

import torch.optim as optim

# Optimizer
optimizer = optim.Adam(net.parameters(),
                       lr=float(args.learningRate),
                       weight_decay=float(args.l2_weightDecay))

What I do not understand is how I can specify explicitly that my PReLU layers should not be affected by weight decay.

Thanks.

See per-parameter optimizer options.
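For example, here is a minimal sketch applying per-parameter options to the Adam setup from the question. The attribute name net.prelu is hypothetical; substitute however your model actually exposes its PReLU:

import torch.optim as optim

# Two parameter groups: the PReLU slopes get weight_decay=0,
# everything else keeps the global value.
prelu_ids = {id(p) for p in net.prelu.parameters()}  # net.prelu is a hypothetical attribute name
base_params = [p for p in net.parameters() if id(p) not in prelu_ids]

optimizer = optim.Adam(
    [
        {'params': base_params},
        {'params': net.prelu.parameters(), 'weight_decay': 0.0},
    ],
    lr=float(args.learningRate),
    weight_decay=float(args.l2_weightDecay),
)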


The docs show per-parameter optimizer options used like this:
optim.SGD([
    {'params': model.base.parameters()},
    {'params': model.classifier.parameters(), 'lr': 1e-3}
], lr=1e-2, momentum=0.9)

but I don't know which attribute of the model contains the PReLU. Is this right:

optim.SGD([
    {'params': model.conv.parameters()},
    {'params': model.bias.parameters()},
    {'params': model.prelu.parameters(), 'weight_decay': 0}
], weight_decay=0.0001, lr=1e-2, momentum=0.9)
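Note that options inside a group dict use string keys ('weight_decay': 0, as written above). If the PReLU modules are not exposed under a single attribute, one way (a sketch) is to collect their parameters by type instead:

import torch.nn as nn
import torch.optim as optim

prelu_params, other_params = [], []
for m in model.modules():
    if isinstance(m, nn.PReLU):
        prelu_params.extend(m.parameters())
    else:
        # recurse=False takes only this module's own parameters,
        # so nothing is collected twice
        other_params.extend(m.parameters(recurse=False))

optim.SGD([
    {'params': other_params},
    {'params': prelu_params, 'weight_decay': 0},
], weight_decay=0.0001, lr=1e-2, momentum=0.9)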
