How does one use torch.optim.lr_scheduler.OneCycleLR()?

I want to use torch.optim.lr_scheduler.OneCycleLR while training. Can someone kindly explain how to use it?
What I gathered from the documentation is that scheduler.step() should be called after each training batch, rather than once per epoch.

My confusions are as follows:

  • Does the max_lr parameter have to be the same as the optimizer's lr parameter?

  • Can this scheduler be used with the Adam optimizer? How is momentum handled in that case?

  • Let’s say I have trained my model for some number of epochs in one stretch, and now I want to train it for a few more epochs. Would I have to reset the scheduler?

Could anybody provide a toy example/training loop that uses this scheduler? Any help would be appreciated. I have included my current attempt below.
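
For reference, here is a minimal sketch of how I currently understand the scheduler should be wired into a training loop (the linear model, random tensors, and hyperparameters are just placeholders to keep it self-contained); please correct me if I have it wrong:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# Toy data and model, only so the example runs end to end.
dataset = TensorDataset(torch.randn(256, 10), torch.randn(256, 1))
loader = DataLoader(dataset, batch_size=32, shuffle=True)
model = nn.Linear(10, 1)
criterion = nn.MSELoss()

epochs = 5
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.OneCycleLR(
    optimizer,
    max_lr=1e-2,                    # peak learning rate of the cycle
    steps_per_epoch=len(loader),    # number of batches per epoch
    epochs=epochs,                  # cycle length = epochs * steps_per_epoch steps
)

for epoch in range(epochs):
    for inputs, targets in loader:
        optimizer.zero_grad()
        loss = criterion(model(inputs), targets)
        loss.backward()
        optimizer.step()
        scheduler.step()            # one scheduler step per batch, not per epoch
```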

P.S.: I am fairly new to deep learning and PyTorch, so my questions might be somewhat silly.