How to apply my vocab to a pretrained embedding?

import torch.nn as nn
from torchtext import vocab

class emb(nn.Module):
    def __init__(self):
        super().__init__()
        glove = vocab.GloVe(name='6B', dim=300)
        self.emb = nn.Embedding.from_pretrained(glove.vectors, freeze=False)

    def forward(self, in_context):
        in_context_emb = self.emb(in_context)
        return in_context_emb

I'm using this code, and in_context has 20001 words in it.
I'm wondering how those 20001 words get mapped into GloVe.
How can I efficiently apply the GloVe vectors to my own vocab?
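
Here is a minimal sketch of what I have in mind, though I'm not sure it's the right way: build a weight matrix for my own vocabulary by looking each token up in GloVe with get_vecs_by_tokens, then load that matrix into nn.Embedding so the indices refer to my vocab instead of GloVe's full vocab. The names my_vocab and in_context below are just placeholders standing in for my 20001-token vocabulary and the index tensor.

import torch
import torch.nn as nn
from torchtext import vocab

glove = vocab.GloVe(name='6B', dim=300)

# placeholders: my_vocab would be the 20001-token vocabulary,
# in_context would hold indices into that vocabulary
my_vocab = ['the', 'movie', 'was', 'great']
in_context = torch.tensor([[1, 2, 3]])

# one row per token in my_vocab; tokens missing from GloVe come back as zero vectors
weights = glove.get_vecs_by_tokens(my_vocab)        # shape: (len(my_vocab), 300)
emb = nn.Embedding.from_pretrained(weights, freeze=False)

in_context_emb = emb(in_context)                    # shape: (1, 3, 300)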

Did you solve this problem?