Did you try setting `freeze=False`, as @SimonW suggested?
If you would like to keep the embedding frozen, could you try to overfit a small data sample (e.g. just 10 samples)? If your model cannot learn even this tiny sample, something else might be wrong in your code.
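As a quick sanity check along those lines, here is a minimal sketch of the "overfit a few samples" test. The model, shapes, and hyperparameters are illustrative stand-ins, not your actual setup; the point is only that a healthy training loop should drive the loss toward zero on such a tiny set.

```python
import torch
import torch.nn as nn

# Tiny synthetic dataset: 10 samples, 8 features, binary labels (illustrative)
torch.manual_seed(0)
x = torch.randn(10, 8)
y = torch.randint(0, 2, (10,))

# Small stand-in model; replace with your own LSTM-based model
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

# If the setup is correct, accuracy on these 10 samples should be ~100%
acc = (model(x).argmax(dim=1) == y).float().mean().item()
print(acc)
```

If the loss plateaus even here, the issue is likely in the training loop or data pipeline rather than in the model capacity.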
@ptrblck
The problem only occurs when I use pretrained word embeddings. If I initialize the embeddings randomly via `nn.Embedding(vocab_size, embedding_dim)`, the LSTM trains properly.
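For reference, the two initialization paths differ in whether the embedding weights receive gradients. A short sketch (the pretrained tensor here is a random stand-in for real GloVe/word2vec vectors, and the sizes are arbitrary):

```python
import torch
import torch.nn as nn

vocab_size, embedding_dim = 100, 50  # illustrative sizes

# Random initialization: weights are trainable by default
emb_random = nn.Embedding(vocab_size, embedding_dim)

# Stand-in for a real pretrained embedding matrix
pretrained = torch.randn(vocab_size, embedding_dim)

# from_pretrained freezes the weights by default (freeze=True);
# pass freeze=False to fine-tune the pretrained vectors during training
emb_frozen = nn.Embedding.from_pretrained(pretrained, freeze=True)
emb_tuned = nn.Embedding.from_pretrained(pretrained, freeze=False)

print(emb_random.weight.requires_grad)  # True
print(emb_frozen.weight.requires_grad)  # False
print(emb_tuned.weight.requires_grad)   # True
```

So if the LSTM trains with random embeddings but not with pretrained ones, it is worth checking whether the pretrained embedding was loaded with the default `freeze=True`, which keeps it fixed throughout training.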