How to use lr_scheduler in opacus?

Hi,
Thanks for this great project! I want to adjust the learning rate in opacus. I create the scheduler before attaching the optimizer, as in the following code:

import torch
from torch.optim import SGD
from opacus import PrivacyEngine

model = Net()  # Net, batch_size, and sample_size are defined elsewhere
optimizer = SGD(model.parameters(), lr=0.05)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
privacy_engine = PrivacyEngine(
    model,
    batch_size,
    sample_size,
    alphas=[10, 100],
    noise_multiplier=1.3,
    max_grad_norm=1.0,
)
privacy_engine.attach(optimizer)

This raises a UserWarning: "Seems like optimizer.step() has been overridden after learning rate scheduler initialization." What is the potential risk in the above code?

I also tried attaching the optimizer before initializing the learning rate scheduler, and the warning goes away. Is that ordering necessary?

Any help is appreciated!

Hello @hkz,
Thank you for surfacing this quirk.

Opacus does indeed override optimizer.step() when the privacy engine is attached to the optimizer. In the current implementation, we perform the per-sample gradient clipping and then call the optimizer's original step() function (see opacus/privacy_engine.py at master · pytorch/opacus · GitHub). Therefore, your code should work correctly regardless of when you initialize the scheduler.
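
To make that concrete, here is a minimal training-loop sketch (train_loader, criterion, and the epoch count are illustrative assumptions, not from your snippet). The scheduler only rewrites optimizer.param_groups[i]["lr"], and the overridden step() still reads the learning rate from there, so the schedule is applied as usual:

for epoch in range(100):
    for data, target in train_loader:          # train_loader assumed to exist
        optimizer.zero_grad()
        loss = criterion(model(data), target)  # criterion assumed to exist
        loss.backward()
        optimizer.step()   # Opacus: per-sample clipping + noise, then the original SGD step
    scheduler.step()       # MultiStepLR: multiplies the lr by 0.1 at epochs 30 and 80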

However, this ordering is slightly risky: it could stop working correctly if the implementation of optimizer.step() ever changes in a manner inconsistent with the scheduler's expectations, although I don't see that happening.

Bottom line: heeding the warning is ideal and recommended. :slight_smile:
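
If you want to silence the warning, the ordering you already tried is the way to go: attach the privacy engine first, then create the scheduler so it wraps the already-overridden step(). A sketch reusing the same (hypothetical) Net, batch_size, and sample_size from your snippet:

model = Net()
optimizer = SGD(model.parameters(), lr=0.05)
privacy_engine = PrivacyEngine(
    model,
    batch_size,
    sample_size,
    alphas=[10, 100],
    noise_multiplier=1.3,
    max_grad_norm=1.0,
)
privacy_engine.attach(optimizer)  # optimizer.step() is overridden here
# Creating the scheduler after attach() lets it track the wrapped step(),
# so the UserWarning is not triggered:
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)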
