Does PyTorch support finite-difference differentiation for optimizers like Adam?

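To make the question concrete, here is a rough sketch of the kind of thing I mean (`f` is just a stand-in for a black-box objective that autograd cannot handle): estimate the gradient by central differences and write it into `.grad` so that `torch.optim.Adam` can take a step from it.

```python
import torch

# Placeholder for a black-box objective autograd cannot differentiate.
def f(x: torch.Tensor) -> torch.Tensor:
    return (x ** 2).sum()

x = torch.tensor([1.0, -2.0, 3.0])   # no requires_grad needed
opt = torch.optim.Adam([x], lr=0.1)
eps = 1e-4

for _ in range(200):
    # Central-difference estimate of df/dx, one coordinate at a time.
    grad = torch.empty_like(x)
    for i in range(x.numel()):
        e = torch.zeros_like(x)
        e[i] = eps
        grad[i] = (f(x + e) - f(x - e)) / (2 * eps)

    x.grad = grad   # hand the estimate to Adam in place of autograd's result
    opt.step()

print(x)  # should end up close to the minimizer [0, 0, 0]
```

Is manually filling `.grad` like this the intended way, or is there built-in support for finite differences?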
Are you trying to differentiate the optimizer itself?