Difference between torch.optim.lr_scheduler.CyclicLR and torch.optim.lr_scheduler.OneCycleLR?

Both policies cycle the learning rate, but I am unable to work out how the two scheduling policies actually differ. Could someone explain how CyclicLR (with its different modes) and OneCycleLR differ in PyTorch?