Exponential decay learning rate

I am trying to implement this: “The learning rate is decayed exponentially from 0.01 to 0.0001 for 30 epochs.”

I found lr_scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma) and tried using it like this:

optimizer = optim.Adam(model.parameters(), lr=0.01)
lr_scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma)

for epoch in range(nb_epoch):
    for data in train_loader:
        ....
        optimizer.step()
lr_scheduler.step()

So how do I combine the number of epochs with the 0.01 → 0.0001 learning rate range in this case?

If you want to decay the learning rate every 30 epochs you could use
torch.optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.01)
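As a sketch of what StepLR does (using a dummy parameter in place of a real model, which is my assumption here, not something from the question):

```python
import torch
from torch import optim

# Dummy parameter standing in for model.parameters() (assumption for the sketch)
param = torch.nn.Parameter(torch.zeros(1))
optimizer = optim.Adam([param], lr=0.01)

# StepLR multiplies the learning rate by gamma once every step_size epochs
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.01)

for epoch in range(60):
    optimizer.step()   # stands in for one epoch of training steps
    scheduler.step()   # advance the schedule once per epoch

# The lr was cut at epochs 30 and 60: 0.01 -> 1e-4 -> 1e-6
print(optimizer.param_groups[0]["lr"])
```

Note the decay here is a single large jump every 30 epochs, not a smooth per-epoch decay.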

Thanks for the response, but the decay has to be exponential. So can I do the same with ExponentialLR?

If I’ve understood correctly, you want an exponential learning rate scheduler that starts at 0.01 and reaches 0.0001 after 30 epochs. We need a gamma such that:

0.01 * gamma^30 = 0.0001, so gamma^30 = 0.01, and therefore gamma = 0.01^(1/30) = 0.85769…

So set gamma to approximately 0.86 and it should do (approximately) what you want.
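The arithmetic can be checked directly in plain Python, no PyTorch needed:

```python
# Solve 0.01 * gamma**30 == 0.0001 for gamma
gamma = (0.0001 / 0.01) ** (1 / 30)

print(gamma)               # ~0.857696
print(0.01 * gamma ** 30)  # ~0.0001
```

In practice it may be cleaner to pass this exact expression as gamma rather than the rounded 0.86, so the schedule lands on 0.0001 precisely at epoch 30.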

One more thing: I think your lr_scheduler.step() should be inside the epoch for-loop (after the batch loop), not after all epochs — otherwise the learning rate is never decayed during training.
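Putting the pieces together, the loop could look like the sketch below (the model, train_loader, and loss are placeholders I've invented to make it runnable; the gamma formula is the calculation from the earlier reply):

```python
import torch
from torch import optim

# Placeholder model and data, just to make the sketch self-contained
model = torch.nn.Linear(4, 1)
train_loader = [(torch.randn(8, 4), torch.randn(8, 1)) for _ in range(5)]
nb_epoch = 30

optimizer = optim.Adam(model.parameters(), lr=0.01)
gamma = (0.0001 / 0.01) ** (1 / nb_epoch)  # so 0.01 * gamma**30 == 0.0001
lr_scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=gamma)

for epoch in range(nb_epoch):
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()
        optimizer.step()
    lr_scheduler.step()  # once per epoch, inside the epoch loop

print(optimizer.param_groups[0]["lr"])  # ~0.0001 after 30 epochs
```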