Recurrent dropout in LSTM


I’ve found something called “recurrent dropout” which I’d like to implement in my model. But I cannot find it in the PyTorch documentation (while I did find it in the TensorFlow documentation). Is there something like this in PyTorch?

Best regards

I don’t think PyTorch supports recurrent dropout off the shelf. See if this is helpful.
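One common workaround is to unroll `nn.LSTMCell` manually and apply a dropout mask to the hidden state between timesteps. The sketch below uses a single mask per sequence (variational-style recurrent dropout); the class name and the dropout rate are just placeholders, not anything from PyTorch itself:

```python
import torch
import torch.nn as nn

class RecurrentDropoutLSTM(nn.Module):
    """Hypothetical example: LSTM unrolled with nn.LSTMCell, applying
    dropout to the recurrent hidden state between timesteps."""

    def __init__(self, input_size, hidden_size, dropout=0.25):
        super().__init__()
        self.cell = nn.LSTMCell(input_size, hidden_size)
        self.hidden_size = hidden_size
        self.dropout = dropout

    def forward(self, x):
        # x: (seq_len, batch, input_size)
        seq_len, batch, _ = x.shape
        h = x.new_zeros(batch, self.hidden_size)
        c = x.new_zeros(batch, self.hidden_size)
        # Sample one mask for the whole sequence and rescale so the
        # expected activation is unchanged (inverted dropout).
        if self.training and self.dropout > 0:
            keep = 1.0 - self.dropout
            mask = x.new_empty(batch, self.hidden_size).bernoulli_(keep) / keep
        else:
            mask = None
        outputs = []
        for t in range(seq_len):
            h, c = self.cell(x[t], (h, c))
            outputs.append(h)  # outputs are the un-dropped hidden states
            if mask is not None:
                h = h * mask   # dropout only on the recurrent connection
        return torch.stack(outputs), (h, c)
```

Note that the mask is applied only to the hidden state fed back into the next step, not to the returned outputs, so it affects just the recurrent connections.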