How to match parameters and optimizer state?

Hi there, I want to initialize a 24-layer transformer from a pre-trained 12-layer one (both the model weights and the optimizer state).

For example, I use the parameters of layer[0] to initialize layer[13], and so on. The model weights work fine, but when I tried to initialize the optimizer the same way, I found that the keys of the optimizer state are not parameter names but numbers like ‘140453283719280’. How do I match these numbers to the parameter names (and, further, how can I extend the state to the new layers)?
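In case it helps anyone answering: those large numbers look like Python `id()`s of the parameter tensors, since `optimizer.state` is keyed by the parameter objects themselves. A minimal sketch of how the names could be recovered (using a toy two-layer model as a stand-in for the pre-trained transformer; the exact layer names are just illustrative):

```python
import torch
import torch.nn as nn

# Toy model standing in for the pre-trained transformer.
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 4))
optimizer = torch.optim.Adam(model.parameters())

# Run one step so the optimizer actually has per-parameter state.
loss = model(torch.randn(2, 4)).sum()
loss.backward()
optimizer.step()

# optimizer.state is keyed by the parameter *objects*; printing those
# keys (or their id()) gives large numbers like 140453283719280.
# Build an id(param) -> name map from the model to recover the names.
id_to_name = {id(p): name for name, p in model.named_parameters()}

named_state = {id_to_name[id(p)]: s for p, s in optimizer.state.items()}
print(sorted(named_state.keys()))
```

Alternatively, `optimizer.state_dict()['state']` uses integer indices that follow the order of the parameters in `param_groups`, so matching by position against `model.named_parameters()` should also work, and may be easier when remapping state to the new layers.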