RNN training speed

Hello community, I might be mistaken, but I have the impression that:

  • training an RNN is very slow (and even slower on a GPU)
  • unless you can parallelize the training using batches (which makes the GPU useful again)
  • but batching entails that all sequences in a batch have the same length (you can pad with an End Of Sequence token, though)

=> so basically, to train faster you need to fix the sequence length, which to me kills much of the flexibility of RNNs, namely being able to train on variable-length sequences.
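To make the padding idea above concrete, here is a minimal sketch in plain Python (no framework assumed; `pad_batch` is a hypothetical helper, not a library function). Each batch is padded only to the length of its own longest sequence, and a mask records which positions are real, so the "fixed length" only has to hold within a batch, not across the whole dataset:

```python
def pad_batch(sequences, pad_value=0):
    """Pad a list of variable-length sequences to the batch's max length.

    Returns the padded batch plus a mask (1 = real token, 0 = padding)
    that the loss computation can use to ignore padded steps.
    """
    max_len = max(len(s) for s in sequences)
    padded = [s + [pad_value] * (max_len - len(s)) for s in sequences]
    mask = [[1] * len(s) + [0] * (max_len - len(s)) for s in sequences]
    return padded, mask

batch, mask = pad_batch([[5, 3], [7, 1, 4, 2], [9]])
# batch -> [[5, 3, 0, 0], [7, 1, 4, 2], [9, 0, 0, 0]]
# mask  -> [[1, 1, 0, 0], [1, 1, 1, 1], [1, 0, 0, 0]]
```

Sorting or bucketing sequences by length before batching keeps the amount of padding (and wasted computation) small.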

Are these assertions correct? Am I missing something?

Thanks a lot