One scheduler for multiple optimizers

Is it possible to use a single learning-rate scheduler with multiple optimizers, so that all of their learning rates decay in the same way?
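
For concreteness, here is roughly what I do now (the models, optimizers, and hyperparameters are just illustrative): I create one scheduler per optimizer with identical settings and step each one separately. I am wondering whether this duplication can be avoided with a single scheduler.

```python
import torch
from torch import nn

# Two separate models, each with its own optimizer (illustrative)
model_a, model_b = nn.Linear(10, 1), nn.Linear(10, 1)
opt_a = torch.optim.SGD(model_a.parameters(), lr=0.1)
opt_b = torch.optim.SGD(model_b.parameters(), lr=0.1)

# Current workaround: one scheduler per optimizer, with identical settings,
# so both learning rates follow the same decay curve
sched_a = torch.optim.lr_scheduler.StepLR(opt_a, step_size=10, gamma=0.5)
sched_b = torch.optim.lr_scheduler.StepLR(opt_b, step_size=10, gamma=0.5)

for epoch in range(30):
    # ... training steps for both models ...
    sched_a.step()
    sched_b.step()
```

Since the scheduler constructor takes a single optimizer, is there a supported way to attach one scheduler instance to both `opt_a` and `opt_b`, or is one-scheduler-per-optimizer the intended pattern?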