Thanks for this great project! I want to adjust the learning rate in Opacus. I create the scheduler before attaching the optimizer, as in the following code:
```python
import torch
from torch.optim import SGD
from opacus import PrivacyEngine

model = Net()
optimizer = SGD(model.parameters(), lr=0.05)

# scheduler is created before the privacy engine is attached
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[30, 80], gamma=0.1
)

privacy_engine = PrivacyEngine(
    model,
    batch_size,
    sample_size,
    alphas=[10, 100],
    noise_multiplier=1.3,
    max_grad_norm=1.0,
)
privacy_engine.attach(optimizer)  # this overrides optimizer.step()
```
This raises `UserWarning: Seems like optimizer.step() has been overridden after learning rate scheduler initialization.` What is the potential risk of the code above?
I also tried attaching the optimizer *before* initializing the learning rate scheduler, and the warning goes away (a sketch of that version is below). Is that ordering necessary?
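For reference, this is roughly the reordered version I tried; `Net`, `batch_size`, and `sample_size` are placeholders from my setup above:

```python
import torch
from torch.optim import SGD
from opacus import PrivacyEngine

model = Net()
optimizer = SGD(model.parameters(), lr=0.05)

privacy_engine = PrivacyEngine(
    model,
    batch_size,
    sample_size,
    alphas=[10, 100],
    noise_multiplier=1.3,
    max_grad_norm=1.0,
)
privacy_engine.attach(optimizer)  # attach first, so step() is already wrapped

# scheduler created after attaching -> no UserWarning
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer, milestones=[30, 80], gamma=0.1
)
```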
Any help is appreciated!