Mini-batch training for the word embeddings example

Hello,

I am trying to modify the example shown here https://pytorch.org/tutorials/beginner/nlp/word_embeddings_tutorial.html#sphx-glr-beginner-nlp-word-embeddings-tutorial-py to take a mini-batch as input instead of looping through one n-gram pair at a time. I tried doing this by modifying the layer input sizes, but convergence slows down dramatically and essentially stalls. Is there a right way to do mini-batch training here in PyTorch?
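Concretely, here is a sketch of the kind of change I mean, based on the `NGramLanguageModeler` from the tutorial. The batched reshape in `forward` and the toy batch sizes below are my own attempt, not the tutorial's code, so this may be exactly where I am going wrong:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NGramLanguageModeler(nn.Module):
    def __init__(self, vocab_size, embedding_dim, context_size):
        super().__init__()
        self.embeddings = nn.Embedding(vocab_size, embedding_dim)
        self.linear1 = nn.Linear(context_size * embedding_dim, 128)
        self.linear2 = nn.Linear(128, vocab_size)

    def forward(self, inputs):
        # inputs: (batch_size, context_size) LongTensor of word indices.
        # Embedding gives (batch_size, context_size, embedding_dim);
        # flatten the last two dims so each row is one example's
        # concatenated context embeddings.
        embeds = self.embeddings(inputs).view(inputs.shape[0], -1)
        out = F.relu(self.linear1(embeds))
        out = self.linear2(out)
        return F.log_softmax(out, dim=1)

# Hypothetical mini-batch: 4 examples, context size 2, vocab size 50.
model = NGramLanguageModeler(vocab_size=50, embedding_dim=10, context_size=2)
loss_fn = nn.NLLLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.001)

contexts = torch.randint(0, 50, (4, 2))  # (batch_size, context_size)
targets = torch.randint(0, 50, (4,))     # (batch_size,)

model.zero_grad()
log_probs = model(contexts)              # (batch_size, vocab_size)
loss = loss_fn(log_probs, targets)       # NLLLoss averages over the batch
loss.backward()
optimizer.step()
```

With this change the training loop would iterate over batches of (context, target) index tensors rather than over individual n-gram pairs, but as described above the loss stops improving once I train this way.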

Regards


This is the exact issue I am having. Did you ever get this figured out?