Trainable embeddings in PyTorch

I have a system that loads pretrained (GloVe/fastText) embeddings and uses them in a neural network. I do a lookup into the embedding for every mini-batch. The issue is that when I create the embedding with requires_grad = True, I get the following error:

RuntimeError: save_for_backward can only save input or output tensors, but argument 0 doesn’t satisfy this condition

Any help would be appreciated.

When I implemented the same thing in TensorFlow, it worked, and making the embedding trainable (trainable=True) yielded better results.
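For reference, a minimal sketch of the setup I am aiming for, using nn.Embedding.from_pretrained with freeze=False as the PyTorch analogue of TensorFlow's trainable=True (the random matrix here is a stand-in for the actual GloVe/fastText vectors loaded from disk):

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained GloVe/fastText matrix of shape
# (vocab_size, embedding_dim); in practice this is loaded from disk.
pretrained = torch.randn(10000, 300)

# freeze=False keeps the embedding weights trainable, so gradients
# flow into them during backprop.
embedding = nn.Embedding.from_pretrained(pretrained, freeze=False)

# Look up a mini-batch of token indices.
batch = torch.tensor([[1, 5, 42], [7, 0, 3]])
vectors = embedding(batch)  # shape: (2, 3, 300)

# Gradients reach the embedding table after backward().
vectors.sum().backward()
print(embedding.weight.requires_grad)  # True
print(embedding.weight.grad.shape)     # torch.Size([10000, 300])
```

With freeze=True instead, the lookup still works but the weights are excluded from the gradient update, matching the frozen-embedding behaviour.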