Changing the learning rate while training, like in MATLAB?

Is there an option in PyTorch, similar to MATLAB, to pause training, change the learning rate, and then resume?

The question may sound odd, but this kind of option is useful when you are building a new model and training on an unfamiliar dataset.

You can change the learning rate using a scheduler from torch.optim.lr_scheduler, or set it directly on the optimizer:

for param_group in optimizer.param_groups:
    param_group['lr'] = lr
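To make that concrete, here is a minimal sketch of both approaches; the tiny linear model, the SGD optimizer, and the specific learning-rate values are placeholders for illustration, not from the original post:

```python
import torch

# Hypothetical tiny model and optimizer, just for illustration
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Option 1: a scheduler that decays the lr by 10x every 5 epochs
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=5, gamma=0.1)
for epoch in range(10):
    # ... one epoch of training would go here ...
    scheduler.step()

# Option 2: overwrite the lr directly at any point during training
for param_group in optimizer.param_groups:
    param_group['lr'] = 0.001

print(optimizer.param_groups[0]['lr'])
```

Option 2 takes effect immediately on the next `optimizer.step()` call, which is what makes it usable for an ad-hoc "pause and tweak" workflow.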

Usually, I save the model every few epochs so that I can pause and resume. I'd like to know if there is a better way.
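A sketch of that pause/resume pattern, assuming a checkpoint that bundles both the model and optimizer state (the file name, epoch value, and new learning rate are arbitrary placeholders):

```python
import torch

model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# "Pause": save a checkpoint with both state dicts, so resuming
# also restores optimizer internals such as momentum buffers
checkpoint = {
    'epoch': 7,
    'model_state': model.state_dict(),
    'optim_state': optimizer.state_dict(),
}
torch.save(checkpoint, 'checkpoint.pt')

# "Resume": reload everything, change the lr, continue training
ckpt = torch.load('checkpoint.pt')
model.load_state_dict(ckpt['model_state'])
optimizer.load_state_dict(ckpt['optim_state'])
for param_group in optimizer.param_groups:
    param_group['lr'] = 0.01  # new learning rate after the pause

start_epoch = ckpt['epoch'] + 1
```

Saving the optimizer state alongside the model is the part people often skip; without it, optimizers with momentum or adaptive statistics effectively restart cold.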