Call ExponentialLR schedule per step or per epoch?

What is considered best practice when using an ExponentialLR schedule? Should the learning rate be decayed once per epoch (as the documentation's examples suggest) or after every optimization step?
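For context, a minimal sketch of the per-epoch convention shown in the PyTorch docs, where `scheduler.step()` is called once after each epoch's `optimizer.step()` calls (the model, `lr=0.1`, and `gamma=0.9` here are illustrative values, not from any documentation):

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    # ... one full epoch of forward/backward passes and optimizer.step() calls ...
    scheduler.step()  # decay once per epoch: lr -> lr * gamma

# After 3 epochs the learning rate is 0.1 * 0.9**3
print(optimizer.param_groups[0]["lr"])
```

Note that if the same `gamma` were applied per optimization step instead, the learning rate would shrink by that factor hundreds of times per epoch, so the choice of granularity and the choice of `gamma` have to be made together.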