Hi.

I’m using the code below to apply different learning rates to different parameter groups and to decay the LR by a factor of 0.1 every 30 epochs. I’m wondering if there is a way to verify that the learning rates have actually decreased, and whether the decay is applied to all three of them.

Thank you.

```
import torch.optim as optim
from torch.optim import lr_scheduler

# base_params: backbone parameters collected elsewhere
optimizer_ft = optim.SGD([
    {'params': base_params, 'lr': 0.01},
    {'params': model.model.fc.parameters(), 'lr': 0.1},
    {'params': model.classifier.parameters(), 'lr': 0.1}
], weight_decay=5e-4, momentum=0.9, nesterov=True)
# Decay the LR of every param group by a factor of 0.1 every 30 epochs
exp_lr_scheduler = lr_scheduler.StepLR(optimizer_ft, step_size=30, gamma=0.1)
```
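One way to check this yourself is to print the `'lr'` entry of each of the optimizer’s `param_groups` as training progresses — `StepLR` updates that entry in place for every group. Below is a minimal, self-contained sketch using a toy three-layer model as a stand-in for the real one (the model and the group split are assumptions; only the param-group structure matters):

```python
import torch.optim as optim
from torch import nn
from torch.optim import lr_scheduler

# Toy stand-in for the real model: three parts, mirroring the three param groups
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4), nn.Linear(4, 2))

optimizer = optim.SGD([
    {'params': model[0].parameters(), 'lr': 0.01},
    {'params': model[1].parameters(), 'lr': 0.1},
    {'params': model[2].parameters(), 'lr': 0.1},
], weight_decay=5e-4, momentum=0.9, nesterov=True)
scheduler = lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(60):
    optimizer.step()   # normally preceded by a forward/backward pass
    scheduler.step()
    if epoch % 30 == 29:
        # each param group carries its own current learning rate
        print(epoch, [group['lr'] for group in optimizer.param_groups])
```

All three values should shrink by a factor of 10 at each boundary. On recent PyTorch versions, `scheduler.get_last_lr()` also returns the list of current learning rates for all groups in one call.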