How to change the learning rate during training?

I’ve been using Caffe, which can change the learning rate during training through its ‘lr_policy’ setting. This functionality is very convenient. I want to know whether there is a similar mechanism to ‘lr_policy’ in PyTorch, or should I implement it myself?

Yes, there is a group of scheduler classes under torch.optim.lr_scheduler. Please refer to this page:
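As a quick sketch of how this looks in practice: StepLR multiplies the learning rate by a factor `gamma` every `step_size` epochs, which behaves like Caffe’s “step” lr_policy. The model, optimizer, and hyperparameter values below are just placeholders for illustration.

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import StepLR

model = nn.Linear(10, 2)  # toy model for illustration
optimizer = optim.SGD(model.parameters(), lr=0.1)

# Decay the LR by a factor of 0.1 every 2 epochs,
# analogous to Caffe's "step" lr_policy with stepsize=2, gamma=0.1
scheduler = StepLR(optimizer, step_size=2, gamma=0.1)

for epoch in range(3):
    # minimal dummy training step
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).sum()
    loss.backward()
    optimizer.step()
    # call the scheduler once per epoch, after optimizer.step()
    scheduler.step()

print(optimizer.param_groups[0]["lr"])  # LR after the decay at epoch 2
```

Other schedulers in the same module (e.g. ExponentialLR, MultiStepLR, ReduceLROnPlateau) cover the other common Caffe policies; they all follow the same pattern of wrapping the optimizer and calling `scheduler.step()`.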

Thank you for your help!