How to set different learning rate multipliers for weight and bias?

In Caffe, the learning rate for the bias is usually set twice as large as that for the weights, via the lr_mult parameter. I don't see a similar option in PyTorch. Is it possible to do this?

Yes. http://pytorch.org/docs/master/optim.html#per-parameter-options

Thank you, but the method in the link only works for setting different learning rates per layer. I need to set different learning rates for the weight and bias within the same layer.

You can do it with per-parameter options. You just have to filter the bias params into one group and the rest into another.

named_parameters should help with filtering the parameters.
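
For example, here is a minimal sketch of that approach. The model, the base learning rate of 0.01, and the use of SGD are placeholders; the point is splitting parameters by name into two groups and giving the bias group twice the base learning rate, similar to Caffe's lr_mult = 2 convention.

```python
import torch
import torch.nn as nn

# Example model (substitute your own).
model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 2))

# Split parameters by name: anything ending in ".bias" goes into the bias
# group, everything else into the weight group.
bias_params = [p for name, p in model.named_parameters() if name.endswith(".bias")]
weight_params = [p for name, p in model.named_parameters() if not name.endswith(".bias")]

# Per-parameter options: the bias group gets twice the base learning rate
# (base lr of 0.01 is just an example value).
optimizer = torch.optim.SGD(
    [
        {"params": weight_params, "lr": 0.01},
        {"params": bias_params, "lr": 0.02},
    ],
    momentum=0.9,
)
```

The same filtering works per layer as well: you can match on any substring of the parameter name (e.g. a specific module prefix) to build as many groups as you need.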