Does using a scheduler still need zero_grad()?

Hi, if I use a scheduler, do I need to call
optimizer.zero_grad()
and then

scheduler.step()

or do I instead call:

scheduler.zero_grad()

Thank you in advance


Hello,

I think optimizer.zero_grad() is still what you want: the lr_scheduler has no .zero_grad() method at all. The scheduler only adjusts the learning rate; it never touches gradients. The optimizer is passed in as an argument when the lr_scheduler is created, like this:

        import torch.optim as optim  # `use_optimizer` and `epochs` are defined elsewhere in your code
        # the lambda returns a multiplicative factor applied to the base learning rate each epoch
        lambda1 = lambda epoch: pow((1 - ((epoch - 1) / epochs)), 0.9)
        lr_scheduler = optim.lr_scheduler.LambdaLR(use_optimizer, lr_lambda=lambda1)
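For context, here is a minimal sketch of how the calls usually fit together in a training loop. The names `model`, `dataloader`, `criterion`, and `epochs` are placeholders for whatever you already have; the optimizer and lambda are just example choices:

    import torch
    import torch.nn as nn
    import torch.optim as optim

    # placeholder setup -- substitute your own model, data, and loss
    model = nn.Linear(10, 2)
    dataloader = [(torch.randn(4, 10), torch.randint(0, 2, (4,))) for _ in range(5)]
    criterion = nn.CrossEntropyLoss()
    epochs = 3

    optimizer = optim.SGD(model.parameters(), lr=0.1)
    scheduler = optim.lr_scheduler.LambdaLR(
        optimizer, lr_lambda=lambda epoch: pow((1 - ((epoch - 1) / epochs)), 0.9)
    )

    for epoch in range(epochs):
        for inputs, targets in dataloader:
            optimizer.zero_grad()   # clear gradients accumulated by backward()
            loss = criterion(model(inputs), targets)
            loss.backward()         # compute gradients
            optimizer.step()        # update the parameters
        scheduler.step()            # adjust the learning rate once per epoch

So zero_grad() is always called on the optimizer (or directly on the model); the scheduler only changes the learning rate between optimizer steps.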