Yes, the learning rate of each param_group in the optimizer will be changed by the scheduler.
If you want to reset the learning rate, you could assign the initial value to each param_group manually and re-create the scheduler:
import torch.optim as optim

# Reset the learning rate of every param_group to its initial value
for param_group in optimizer.param_groups:
    param_group['lr'] = init_lr  # init_lr is the learning rate you started training with

# Re-create the scheduler; last_epoch=-1 restarts its internal epoch counter
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1, last_epoch=-1)
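For illustration, here is a minimal self-contained sketch (the Linear model and init_lr = 0.1 are made-up placeholders) showing the learning rate decaying on each scheduler.step() call and then being restored by the reset:

import torch
import torch.optim as optim

init_lr = 0.1  # assumed initial learning rate for this example
model = torch.nn.Linear(2, 2)  # dummy model just to have parameters
optimizer = optim.SGD(model.parameters(), lr=init_lr)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1)

for epoch in range(3):
    # ... training and optimizer.step() would go here ...
    scheduler.step()
    print(optimizer.param_groups[0]['lr'])  # ~0.01, ~0.001, ~0.0001

# Reset the learning rate and start the schedule from scratch
for param_group in optimizer.param_groups:
    param_group['lr'] = init_lr
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=1, gamma=0.1, last_epoch=-1)
print(optimizer.param_groups[0]['lr'])  # back to 0.1

Re-creating the scheduler with last_epoch=-1 makes the decay schedule start over from epoch 0.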