Change Adam beta parameter while training

Dear all,
I am learning semi-supervised learning for image classification. I notice that in Temporal Ensembling and Mean Teacher, the optimizer is Adam and the beta parameter is changed during training. However, I cannot find a way to change this parameter in PyTorch. Does anybody know how to achieve something similar? Thanks.
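
For reference, this is the kind of update I have in mind (a minimal sketch; the schedule values and the helper `set_adam_beta1` are just illustrative, and I am not sure whether editing `optimizer.param_groups` like this is the intended way):

```python
import torch

model = torch.nn.Linear(10, 2)  # placeholder model for illustration
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, betas=(0.9, 0.999))

def set_adam_beta1(optimizer, beta1):
    # Overwrite beta1 in every parameter group, keeping beta2 unchanged.
    for group in optimizer.param_groups:
        _, beta2 = group['betas']
        group['betas'] = (beta1, beta2)

num_epochs = 100
for epoch in range(num_epochs):
    # Hypothetical schedule: lower beta1 near the end of training.
    beta1 = 0.9 if epoch < 80 else 0.5
    set_adam_beta1(optimizer, beta1)
    # ... run the training loop for this epoch ...
```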