Does lr_scheduler update the learning rate of each layer when using differential learning rates?

I am trying to train resnet34 with differential learning rates (each layer group having its own learning rate) using `torch.optim.lr_scheduler.OneCycleLR`.
Does this scheduler update each of the differential learning rates, i.e., the learning rate of every layer group?
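For reference, here is a minimal sketch of the setup I mean. A tiny two-layer model stands in for resnet34's layer groups (the split and the learning-rate values are hypothetical); `OneCycleLR` is given one `max_lr` per parameter group, and I print each group's learning rate after every step to see whether both are updated:

```python
import torch
from torch import nn, optim

# toy two-layer model standing in for resnet34's layer groups
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# one parameter group per "layer", each with its own learning rate
optimizer = optim.SGD(
    [
        {"params": model[0].parameters()},  # "early" layers
        {"params": model[1].parameters()},  # "head"
    ],
    lr=1e-3,
)

# passing a list of max_lr values assigns one max_lr per param group
scheduler = optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=[1e-3, 1e-2], total_steps=10
)

lrs = []
for _ in range(3):
    optimizer.step()
    scheduler.step()
    # record the current learning rate of every param group
    lrs.append([g["lr"] for g in optimizer.param_groups])
    print(lrs[-1])
```

In my runs, every `scheduler.step()` changes the `lr` of both parameter groups, each scaled relative to its own `max_lr`, but I would like confirmation that this is the intended behavior.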