Variational RNN

The usual way of applying dropout to an LSTM samples a different mask at every time step, which is ad hoc and leads to unstable results. According to this paper, we should instead use the same dropout mask at every time step: Variational RNN
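For intuition, here is a minimal sketch of the idea in PyTorch (my own illustration, not the paper's code): sample one Bernoulli mask per sequence and reuse it at every time step, instead of resampling at each step as ordinary dropout does.

```python
import torch.nn as nn

class VariationalDropout(nn.Module):
    """Dropout with a single mask shared across all time steps.

    Expects input of shape (batch, seq_len, features).
    """
    def __init__(self, p: float = 0.5):
        super().__init__()
        self.p = p

    def forward(self, x):
        if not self.training or self.p == 0.0:
            return x
        # Sample one keep-mask per sequence (note the size-1 time
        # dimension) and broadcast it over every time step.
        mask = x.new_empty(x.size(0), 1, x.size(2)).bernoulli_(1 - self.p)
        return x * mask / (1 - self.p)
```

This only covers the masks on the inputs/outputs; the paper also fixes the mask on the recurrent connections, which requires a manually unrolled LSTM (or dropout applied to the recurrent weights), so the sketch is only part of the story.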

Here is a screenshot of what should ideally happen:

Keras supports this via the `dropout` and `recurrent_dropout` arguments (example below).
Is there a neat implementation of this for PyTorch? Thanks for helping.
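(For reference, this is the Keras version I mean; both masks are generated once per sequence and reused across the time steps.)

```python
from tensorflow.keras.layers import LSTM

# dropout masks the inputs, recurrent_dropout masks the recurrent
# state; Keras reuses both masks at every time step.
layer = LSTM(128, dropout=0.25, recurrent_dropout=0.25)
```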

I think the answer is no. I have also looked for this feature but haven't found anything. In my Keras model it improves accuracy by about 5%, which is the main reason I haven't been able to port my model to PyTorch yet.

I spoke too soon. It seems this LSTM implementation supports variational dropout: https://github.com/keitakurita/Better_LSTM_PyTorch
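Going by the repo's README, usage looks roughly like the sketch below; the parameter names (`dropouti`, `dropoutw`, `dropouto`) are my reading of the repo and may have changed, so double-check against the README before relying on them.

```python
from better_lstm import LSTM

# Per my reading of the README: dropouti/dropoutw/dropouto apply
# dropout to the inputs, recurrent weights, and outputs respectively.
lstm = LSTM(input_size=100, hidden_size=100,
            dropouti=0.2, dropoutw=0.2, dropouto=0.2)
```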