Looking for advice regarding schedulers: learning rate and milestones

Hello everyone!

I am quite new to PyTorch and still trying to figure out all the ins and outs. I am currently working on my M.Sc. thesis in Mathematics, and I was wondering if anyone could give me some advice on setting milestones for my scheduler: what are your go-to approaches when configuring a scheduler (e.g., how many milestones, which epochs to place them at, the initial learning rate, etc.)? I know there is no perfect recipe when it comes to neural networks, but I was wondering if there is a smarter way than brute force to figure out what your data might need. Right now I am working with medical images (PET and CT), and I have started playing with the MultiStepLR scheduler (rough setup below).
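For reference, this is roughly the kind of setup I mean; the model, learning rate, and milestone epochs are just placeholder values I am experimenting with:

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import MultiStepLR

model = nn.Conv2d(1, 16, kernel_size=3)  # stand-in for my actual network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Multiply the learning rate by gamma=0.1 at epochs 30, 60, and 90
scheduler = MultiStepLR(optimizer, milestones=[30, 60, 90], gamma=0.1)

for epoch in range(100):
    # ... forward pass, loss, backward pass would go here ...
    optimizer.step()   # placeholder parameter update
    scheduler.step()   # advance the schedule once per epoch
```

My question is mostly about how to choose those milestone epochs and the decay factor without just trying every combination.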

Any advice is very much appreciated. Thank you very much in advance.

Sincerely,

Nicolas Destefanis


I don’t know of a good approach for setting milestones besides running a lot of experiments.
As a default, I would reduce the learning rate once the validation loss stops decreasing significantly, via lr_scheduler.ReduceLROnPlateau (rough example below).
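A minimal sketch of what that could look like; the model, optimizer, factor, patience, and the fake validation loss are all just placeholders for your actual setup:

```python
import torch
import torch.nn as nn
from torch.optim.lr_scheduler import ReduceLROnPlateau

model = nn.Linear(10, 1)  # stand-in for your network
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)

# Multiply the learning rate by `factor` once the monitored metric
# (here the validation loss, mode='min') hasn't improved for `patience` epochs.
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=5)

for epoch in range(100):
    # ... training loop for one epoch would go here ...
    val_loss = torch.rand(1).item()  # placeholder for your real validation loss
    scheduler.step(val_loss)  # pass the metric you want to monitor
```

This way you don’t have to guess the milestone epochs up front; the learning rate drops whenever the validation metric actually plateaus.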
