I want to use torch.optim.lr_scheduler.OneCycleLR while training. Can someone kindly explain to me how to use it?
What I got from the documentation is that scheduler.step() should be called after each batch, rather than once per epoch.
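If I've read that right, something like this tiny sketch is what the stepping should look like (the dummy parameter and total_steps=10 are arbitrary, just so I can print the LR curve):

```python
import torch

# One dummy parameter so the optimizer has something to hold
param = torch.zeros(1, requires_grad=True)
optimizer = torch.optim.SGD([param], lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.1, total_steps=10
)

for step in range(10):      # pretend each iteration is one batch
    optimizer.step()
    scheduler.step()        # stepped after every batch, not every epoch
    print(step, scheduler.get_last_lr())
```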
My confusions are as follows:
1. Does the max_lr parameter have to be the same as the lr passed to the optimizer? (See the snippet right after this list for what I mean.)
2. Can this scheduler be used with the Adam optimizer? How is the momentum calculated then, given that Adam uses betas instead of a momentum parameter?
3. Let's say I trained my model for some number of epochs at a stretch, and now I want to train for some more epochs. Would I have to reset the scheduler?
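To make points 1 and 2 concrete, here is the kind of setup I'm unsure about (the lr and max_lr values are arbitrary, picked only to show that they differ):

```python
import torch

model = torch.nn.Linear(10, 2)                             # dummy model
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # note lr=1e-3

# Point 1: does max_lr below have to equal the lr=1e-3 above?
# Point 2: Adam has betas instead of momentum -- is cycle_momentum=True
#          still valid here, or does the scheduler cycle beta1 instead?
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.01, total_steps=1000, cycle_momentum=True
)
```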
Can anybody provide a sort of toy example/training loop that implements this scheduler? Any help would be appreciated.
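Here is my current attempt, in case that helps anyone point out what I'm doing wrong (a minimal sketch with made-up data; the model, dataset, and max_lr are all just placeholders):

```python
import torch
import torch.nn as nn

# Dummy data and model, just to make the sketch self-contained
X = torch.randn(64, 10)
y = torch.randint(0, 2, (64,))
loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=16
)

model = nn.Linear(10, 2)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

epochs = 5
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=0.01,
    epochs=epochs,
    steps_per_epoch=len(loader),  # so total_steps = epochs * len(loader)
)

for epoch in range(epochs):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        scheduler.step()  # once per batch, as I understood the docs
```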
P.S.: I am kind of new to deep learning & PyTorch, so my question might be somewhat silly.