Size mismatch of embedding weights while loading

Hi, I have saved my vocab file and my LSTM model. When I try to load the LSTM model and vocab dict, I get a size mismatch error similar to this.
Can anyone help?

RuntimeError                              Traceback (most recent call last)
<ipython-input-41-cc9010adf021> in <module>()
      4 #from flask_ngrok import run_with_ngrok
      5 import pickle
----> 6 from predict import *
      7 import threading

1 frames
/usr/local/lib/python3.6/dist-packages/torch/nn/modules/ in load_state_dict(self, state_dict, strict)
   1050         if len(error_msgs) > 0:
   1051             raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
-> 1052                                self.__class__.__name__, "\n\t".join(error_msgs)))
   1053         return _IncompatibleKeys(missing_keys, unexpected_keys)

RuntimeError: Error(s) in loading state_dict for LSTM:
	size mismatch for embedding.weight: copying a param with shape torch.Size([10654, 100]) from checkpoint, the shape in current model is torch.Size([25002, 100]).

Did you manipulate this parameter after storing the state_dict?
If not, could you post a minimal code snippet to reproduce this issue, so that we can debug it?
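For reference, this error usually means the model was re-instantiated with a different vocabulary size than the one used when the checkpoint was saved (here 25002 tokens vs. the 10654 in the checkpoint), e.g. by rebuilding the vocab from a freshly tokenized corpus instead of loading the saved vocab first. A minimal sketch, assuming that scenario and using only the embedding layer with the sizes from the traceback:

```python
import torch.nn as nn

# Sizes taken from the traceback above.
ckpt_vocab_size, emb_dim = 10654, 100

# Model as it existed at save time; its state_dict stands in for the checkpoint.
trained = nn.Embedding(ckpt_vocab_size, emb_dim)
state = trained.state_dict()

# Rebuilding the model with a different (re-derived) vocab size reproduces the error:
rebuilt = nn.Embedding(25002, emb_dim)
try:
    rebuilt.load_state_dict(state)
except RuntimeError as e:
    print("size mismatch:", e)

# Fix: load the saved vocab first and size the model from it,
# i.e. construct the embedding with len(saved_vocab) tokens.
fixed = nn.Embedding(ckpt_vocab_size, emb_dim)
fixed.load_state_dict(state)  # loads cleanly
```

In other words, pickle-load the vocab dict before constructing the model, and pass `len(vocab)` as the embedding's input dimension rather than recomputing the vocabulary at inference time.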