Seeking clarification on dropout in RNN/LSTM

Based on this figure in the post: LSTM dropout - Clarification of Last Layer

The dropout is applied between layers but not between timesteps. Is that correct?

Based on reading this code (StackedRNN): https://github.com/pytorch/pytorch/blob/master/torch/nn/_functions/rnn.py

Dropout between timesteps does not seem to be implemented there yet…
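For what it's worth, here is a minimal sketch of my understanding of the "between layers, not between timesteps" behaviour. The class name and structure below are illustrative, not PyTorch internals: each layer's full output sequence gets one dropout pass before being fed to the next layer, and no dropout is applied along the time dimension inside a layer.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

class StackedLSTMWithInterLayerDropout(nn.Module):
    """Illustrative stacked LSTM: dropout on each layer's output
    sequence (except the last layer), i.e. between layers only."""

    def __init__(self, input_size, hidden_size, num_layers, dropout):
        super().__init__()
        self.layers = nn.ModuleList(
            nn.LSTM(input_size if i == 0 else hidden_size,
                    hidden_size, batch_first=True)
            for i in range(num_layers)
        )
        self.dropout = nn.Dropout(dropout)

    def forward(self, x):
        out = x
        for i, layer in enumerate(self.layers):
            out, _ = layer(out)  # out: (batch, seq, hidden)
            # Dropout hits the whole output sequence of each
            # non-final layer; timesteps within a layer are untouched.
            if i < len(self.layers) - 1:
                out = self.dropout(out)
        return out

model = StackedLSTMWithInterLayerDropout(4, 8, num_layers=2, dropout=0.5)
x = torch.randn(2, 5, 4)  # (batch, seq, feature)
y = model(x)
print(y.shape)  # torch.Size([2, 5, 8])
```

If this matches what `nn.LSTM(..., dropout=0.5)` does internally, then a recurrent-dropout variant (same mask reused across timesteps, as in Gal & Ghahramani) would indeed need a custom cell loop.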