Check if multiple learning rates have decreased when using lr_scheduler

Hi.

I’m using the code below to apply different learning rates and decay them by a factor of 0.1 every 30 epochs. Is there a way to check that the learning rates have actually decreased, and whether the decay is applied to all three of them?

Thank you.

import torch.optim as optim
from torch.optim import lr_scheduler

# Three parameter groups with different initial learning rates
optimizer_ft = optim.SGD([
    {'params': base_params, 'lr': 0.01},
    {'params': model.model.fc.parameters(), 'lr': 0.1},
    {'params': model.classifier.parameters(), 'lr': 0.1}
], weight_decay=5e-4, momentum=0.9, nesterov=True)

# Decay LR by a factor of 0.1 every 30 epochs
exp_lr_scheduler = lr_scheduler.StepLR(optimizer_ft, step_size=30, gamma=0.1)

Try this (get_lr() is a method of the scheduler instance, not of the lr_scheduler module):

exp_lr_scheduler.get_lr()[0]
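Note that get_lr() returns a list with one entry per param group, so dropping the [0] lets you see all three learning rates at once. A rough sketch, assuming the exp_lr_scheduler instance from the question; on PyTorch 1.4+ get_last_lr() is the recommended way to read the values the scheduler last set:

print(exp_lr_scheduler.get_lr())       # one value per param group, e.g. [0.01, 0.1, 0.1]
# on recent PyTorch versions calling get_lr() directly may emit a warning; prefer:
print(exp_lr_scheduler.get_last_lr())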

Thank you.
get_lr() does the computation and returns what is supposed to be the LR for the current epoch. But does that mean all the learning rates are actually updated accordingly? I’d like to confirm the values from the optimizer’s LR parameters themselves, unless I’m getting this wrong.

It should be the same as what @fangyh said, but you can also read the LR directly from the optimizer:

optimizer.param_groups[0]['lr']
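If you want to confirm that all three groups decay, a minimal sketch (assuming the optimizer_ft and exp_lr_scheduler objects from the original post) is to print every group's lr once per epoch:

for epoch in range(90):  # 90 epochs chosen arbitrarily for illustration
    # ... forward/backward passes and optimizer_ft.step() go here ...
    exp_lr_scheduler.step()
    # Starts as [0.01, 0.1, 0.1]; each value is multiplied by 0.1 after every 30 scheduler steps
    print(epoch, [group['lr'] for group in optimizer_ft.param_groups])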


Thanks, this makes sense and it works.