LSTM not training when pretrained vectors are loaded using nn.Embedding.from_pretrained

When I load vectors using nn.Embedding.from_pretrained(), the training accuracy of the model doesn't change from one epoch to the next.

But when I initialize them randomly, the training accuracy changes.

Here is my code:
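A minimal sketch of the setup, with hypothetical shapes and a random matrix standing in for the actual pretrained vectors:

```python
import torch
import torch.nn as nn

vocab_size, embedding_dim, hidden_dim, num_classes = 10_000, 100, 128, 2
# Placeholder: in the real code this is the loaded GloVe/word2vec matrix.
pretrained_vectors = torch.randn(vocab_size, embedding_dim)

class LSTMClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        # Copies the given matrix into the layer; freeze defaults to True.
        self.embedding = nn.Embedding.from_pretrained(pretrained_vectors)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):                  # x: (batch, seq_len) of token ids
        embedded = self.embedding(x)       # (batch, seq_len, embedding_dim)
        _, (h_n, _) = self.lstm(embedded)  # h_n: (1, batch, hidden_dim)
        return self.fc(h_n[-1])            # logits: (batch, num_classes)
```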

Here is the data:

Any help is very much appreciated.

https://pytorch.org/docs/master/nn.html#torch.nn.Embedding.from_pretrained

See the freeze argument. It is True by default, so the embedding weights get requires_grad=False and are never updated during training.
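For example (with weights being any float tensor of shape (num_embeddings, embedding_dim)):

```python
import torch
import torch.nn as nn

weights = torch.randn(10_000, 100)  # stand-in for real pretrained vectors

frozen = nn.Embedding.from_pretrained(weights)                 # freeze=True by default
trainable = nn.Embedding.from_pretrained(weights, freeze=False)

print(frozen.weight.requires_grad)     # False -> gradients never flow into it
print(trainable.weight.requires_grad)  # True  -> updated by the optimizer
```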

Hi @ptrblck

Can you please help me solve this problem?

Did you try to set freeze=False as @SimonW suggested?
If you would like to keep the embedding frozen, could you try to overfit a small data sample (e.g. just 10 samples)? If your model is not able to learn even this small sample, something else might be wrong with your code.
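Something like this, as a rough sketch with a tiny stand-in model and random data (substitute your real model and a fixed subset of your dataset):

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):  # stand-in for the actual LSTM classifier
    def __init__(self):
        super().__init__()
        self.embedding = nn.Embedding.from_pretrained(torch.randn(100, 16))
        self.lstm = nn.LSTM(16, 32, batch_first=True)
        self.fc = nn.Linear(32, 2)

    def forward(self, x):
        _, (h_n, _) = self.lstm(self.embedding(x))
        return self.fc(h_n[-1])

model = TinyModel()
criterion = nn.CrossEntropyLoss()
# Only pass parameters that require gradients (the embedding is frozen).
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)

x = torch.randint(0, 100, (10, 20))  # 10 samples, sequence length 20
y = torch.randint(0, 2, (10,))       # 10 binary labels

for _ in range(300):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()

print(loss.item())  # should approach 0 if the rest of the code is sound
```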

@ptrblck
The problem only occurs when I use pretrained word embeddings. If I initialize the embeddings randomly using nn.Embedding(vocabsize, embeddingdim), the LSTM trains properly.

Just set freeze=False.

I found the issue: I wasn't shuffling the data, which is why it wasn't training. Thank you @ptrblck and @SimonW for your valuable time.
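For anyone who hits the same symptom, enabling shuffling in the DataLoader is the fix, roughly like this (hypothetical tensors in place of my real data):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical tensors standing in for the real token ids and labels.
dataset = TensorDataset(torch.randint(0, 100, (1000, 20)),
                        torch.randint(0, 2, (1000,)))

# shuffle=True reorders the samples every epoch; with sorted/grouped data
# and no shuffling, mini-batch training can stall exactly like this.
train_loader = DataLoader(dataset, batch_size=32, shuffle=True)
```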