Load weights into an nn.Embedding of a different size

I have two nn.Embedding modules, A and B, with different numbers of embeddings; B contains A. For example:
A:

index    weight
1        [0.1, 0.2]
2        [0.4, 0.6]

B:

index    weight
1        zeros
2        zeros
3        zeros

Now I want to copy the weights of A into B. After the copy, B should look like:
B:

index    weight
1        [0.1, 0.2]
2        [0.4, 0.6]
3        zeros

Is there a workable way to do this? The only weight-loading function I know of is from_pretrained, but it requires the two embeddings to have the same shape.

Or… is there any way to expand A? I don’t know whether an nn.Embedding’s size is fixed after creation.
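
The most direct thing I can think of is copying A’s rows into the first rows of B’s weight tensor in place. A minimal sketch, assuming both embeddings share the same embedding dimension and that in-place slice assignment under torch.no_grad() is acceptable (indices here are 0-based, unlike the 1-based indices in the tables above):

import torch
import torch.nn as nn

A = nn.Embedding(2, 2)  # the smaller, already-trained embedding
B = nn.Embedding(3, 2)  # the larger table

with torch.no_grad():
    B.weight.zero_()                        # start from zeros, as in the example
    B.weight[:A.num_embeddings] = A.weight  # rows 0..1 take A's weights, row 2 stays zero

print(B.weight)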

Do you mean something like this?

import torch.nn as nn

a = nn.Embedding(3, 2, padding_idx=2)  # the row at padding_idx is initialized to zeros
list(a.parameters())
[Parameter containing:
 tensor([[ 0.7825, -1.5956],
         [-1.9341,  0.7428],
         [ 0.0000,  0.0000]], requires_grad=True)]
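
padding_idx only zeroes one row at initialization; it doesn’t carry A’s trained rows over. If you prefer to go through from_pretrained, another option is to pad A’s weight matrix with zero rows up to B’s size and build B from the padded tensor. A rough sketch, assuming you want the extra rows initialized to zeros:

import torch
import torch.nn as nn

A = nn.Embedding(2, 2)  # the smaller, already-trained embedding

# Stack A's rows on top of zero rows so the result has B's shape (3, 2).
extra = torch.zeros(1, A.embedding_dim)
padded = torch.cat([A.weight.detach(), extra], dim=0)

# from_pretrained builds a new embedding from a (num_embeddings, embedding_dim) tensor.
B = nn.Embedding.from_pretrained(padded, freeze=False)
print(B.weight.shape)  # torch.Size([3, 2])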