Per-parameter-group learning rate schedules

When fine-tuning, it is sometimes desirable to use different learning rates for different parameter groups.
Setting a separate *initial* learning rate per parameter group is well documented.
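For reference, here is a minimal sketch of the documented part: passing parameter-group dicts to the optimizer so each group gets its own initial learning rate (the model and group names are just illustrative):

```python
import torch
from torch import nn, optim

# Toy stand-in for a fine-tuning setup: a pretrained "backbone"
# and a freshly initialized "head" (names are illustrative).
model = nn.Sequential()
model.add_module("backbone", nn.Linear(16, 8))
model.add_module("head", nn.Linear(8, 2))

# Per-group initial learning rates via a list of param-group dicts.
optimizer = optim.SGD(
    [
        {"params": model.backbone.parameters(), "lr": 1e-4},  # small LR for pretrained weights
        {"params": model.head.parameters(), "lr": 1e-2},      # larger LR for the new head
    ]
)

print([g["lr"] for g in optimizer.param_groups])  # → [0.0001, 0.01]
```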
But what about the scheduler?
How can I set a separate schedule for each parameter group?
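To make the question concrete, here is a sketch of the behavior I am after. I noticed that `LambdaLR` accepts a list of lambdas, one per parameter group, which seems close to what I want, but I am not sure whether this is the intended mechanism or whether there is a better-supported approach:

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import LambdaLR

# Illustrative setup: two parameter groups with different initial LRs.
model = nn.Sequential(nn.Linear(16, 8), nn.Linear(8, 2))
optimizer = optim.SGD(
    [
        {"params": model[0].parameters(), "lr": 1e-4},
        {"params": model[1].parameters(), "lr": 1e-2},
    ]
)

# Desired behavior: each group follows its own schedule.
# LambdaLR takes one lambda per group when given a list.
scheduler = LambdaLR(
    optimizer,
    lr_lambda=[
        lambda epoch: 1.0,           # keep the first group's LR constant
        lambda epoch: 0.9 ** epoch,  # decay the second group's LR geometrically
    ],
)

for _ in range(3):
    optimizer.step()
    scheduler.step()

# Group 0 stays at 1e-4; group 1 has decayed by 0.9 ** 3.
print([g["lr"] for g in optimizer.param_groups])
```

Is this the recommended way, or is there a cleaner pattern for attaching entirely different schedulers to different groups?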