Alternative method for changing learning rate multiplier?

It seems PyTorch doesn't have a learning rate multiplier parameter for each layer.

Is there a better way to set a different learning rate multiplier for different layers?
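For context, the closest built-in mechanism I'm aware of is optimizer parameter groups, where each group can override the default `lr`. A minimal sketch (the layer split and multiplier values here are just for illustration, not from any specific model):

```python
import torch
import torch.nn as nn

# Toy model: two linear layers we want to train at different rates.
model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 2))

base_lr = 0.1
optimizer = torch.optim.SGD(
    [
        # First layer trains at 0.1x the base learning rate.
        {"params": model[0].parameters(), "lr": base_lr * 0.1},
        # Second layer falls back to the default lr below.
        {"params": model[2].parameters()},
    ],
    lr=base_lr,
)
```

This sets the rates once at construction time, though, rather than acting as a per-layer multiplier that schedulers scale uniformly, which is why I'm asking whether there's a cleaner approach.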