Memory error when loading GloVe embeddings

import torch
from torchtext.vocab import Vectors

# TEXT is the torchtext Field built earlier; train_data is the training split
vectors = Vectors(name='glove.6B.200d.txt')
TEXT.build_vocab(train_data, vectors=vectors, max_size=25000, unk_init=torch.Tensor.normal_)

This machine has no internet access, so I had to copy the embedding files to it manually.

I get this error while loading the word vectors:

RuntimeError: Vector for token b'zsombor' has 223 dimensions, but previously read vectors have 300 dimensions. All vectors must have the same number of dimensions.

I have already specified max_size, so I am not sure what is happening.

I'm not deeply familiar with torchtext, but the error message seems to indicate that these vectors were created previously and their dimension is now changing? The code snippet looks as if you are initializing the vectors from scratch. Could you explain the use case a bit?
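
One thing worth ruling out first: since the file was copied over manually, a truncated or corrupted line in the .txt file would produce exactly this error. Note also that the error reports 300-dimensional vectors even though the filename says 200d, which could itself point to the wrong or a damaged file. Below is a minimal diagnostic sketch (check_glove_file is a hypothetical helper, not part of torchtext; the path is taken from your snippet) that scans the file for rows whose dimension differs from the first row's:

# Hypothetical diagnostic, not part of torchtext: report any line whose
# number of values differs from the first line's. Reads bytes, matching
# how torchtext parses the file.
def check_glove_file(path):
    expected = None
    with open(path, 'rb') as f:
        for lineno, line in enumerate(f, start=1):
            parts = line.rstrip().split(b' ')
            dims = len(parts) - 1  # first field is the token itself
            if expected is None:
                expected = dims
                print(f'first line has {dims} dimensions')
            elif dims != expected:
                print(f'line {lineno}: token {parts[0]!r} has {dims} dims, expected {expected}')

check_glove_file('glove.6B.200d.txt')

If this flags mismatched lines, re-copying the file (ideally verifying its size or checksum against the original download) should resolve the loading error.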