How to fix size mismatch for embeddings.weight


I saved my model's state_dict. When I try to load it, I get this error:

size mismatch for embeddings.weight: copying a param with shape torch.Size([7450, 300]) from checkpoint, the shape in current model is torch.Size([7469, 300]).

I found that this is because TEXT.build_vocab(train_data, vectors=Vectors(w2v_file)) can produce a different vocabulary each time it runs, but I need that vocabulary to construct my model: def __init__(self, config, vocab_size, word_embeddings). How can I fix it?
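One common way to handle this (a minimal sketch, not the poster's actual code) is to persist the vocabulary alongside the saved weights, so that the exact same vocab_size is available when the model is rebuilt. Here a plain dict stands in for the torchtext Vocab object; in the real code you would pickle TEXT.vocab next to the torch.save()'d state_dict instead of calling build_vocab again at load time. The names save_vocab and load_vocab are hypothetical helpers:

```python
import pickle

def save_vocab(vocab, path):
    # Persist the vocabulary built by TEXT.build_vocab so the same
    # vocab_size (and word-to-index mapping) is available at load time.
    with open(path, "wb") as f:
        pickle.dump(vocab, f)

def load_vocab(path):
    # Restore the exact vocabulary used at training time, instead of
    # rebuilding it (which may yield a different size/order).
    with open(path, "rb") as f:
        return pickle.load(f)

# Stand-in vocabulary; a real torchtext Vocab pickles the same way.
vocab = {"<unk>": 0, "<pad>": 1, "the": 2, "cat": 3}
save_vocab(vocab, "model.vocab")
restored = load_vocab("model.vocab")
assert len(restored) == len(vocab)  # vocab_size now matches on reload
```

With the restored vocabulary, the model can be constructed with vocab_size=len(restored) before calling load_state_dict, so the embedding shapes agree with the checkpoint.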


Hi @Sirui_Li

Did you find a solution to the problem? I am getting a similar problem, though in my case the mismatch is smaller.