Hogwild slow for training an RNN language model

Hello,

I am trying to implement Hogwild-style training for a language model.

The model and training procedure are similar to the ones in http://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html, except that I only use the decoder part (no encoder or attention mechanism) and I use a DataLoader instead of randomly sampling sentences myself. The multiprocessing part follows the Hogwild example in http://pytorch.org/docs/master/notes/multiprocessing.html.
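
Roughly, my setup follows the pattern from that multiprocessing note: share the model's parameters, then spawn workers that each update them without locking. This is only a sketch of what I'm doing; `MyDecoderRNN` and `make_loader` are placeholders for my actual decoder model and DataLoader setup:

```python
import torch
import torch.multiprocessing as mp
import torch.nn.functional as F

def train(model, rank):
    # Each worker builds its own optimizer and data loader, and updates
    # the shared parameters without any locking (Hogwild).
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    for data, target in make_loader(rank):  # placeholder for my DataLoader
        optimizer.zero_grad()
        loss = F.nll_loss(model(data), target)
        loss.backward()
        optimizer.step()  # gradients are applied to the shared parameters

if __name__ == '__main__':
    model = MyDecoderRNN()   # placeholder for my decoder-only model
    model.share_memory()     # move parameters into shared memory
    processes = []
    for rank in range(4):    # 4 worker processes, as an example
        p = mp.Process(target=train, args=(model, rank))
        p.start()
        processes.append(p)
    for p in processes:
        p.join()
```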

I found that each process runs much slower than in the single-process case. Any ideas, please?

Thank you


Did anyone figure this out? Both data loading and training are slowed down when using Hogwild.