Dropout in bidirectional LSTM

Hi, is there a way to add dropout after each LSTM layer when I define a 3-layer LSTM like this?

self.rnn = nn.LSTM(hidden_dim, hidden_dim, num_layers=3, bidirectional=True)

The dropout argument of nn.LSTM does exactly this (except after the last layer):

dropout – If non-zero, introduces a Dropout layer on the outputs of each LSTM layer except the last layer, with dropout probability equal to dropout. Default: 0
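
For example, a minimal sketch (the dropout probability of 0.5 and the tensor sizes are just placeholders):

import torch
import torch.nn as nn

hidden_dim = 64  # placeholder size

rnn = nn.LSTM(
    input_size=hidden_dim,
    hidden_size=hidden_dim,
    num_layers=3,
    bidirectional=True,
    dropout=0.5,  # applied to the outputs of layers 1 and 2, but not layer 3
)

x = torch.randn(10, 8, hidden_dim)  # (seq_len, batch, input_size); batch_first=False by default
output, (h_n, c_n) = rnn(x)
print(output.shape)  # torch.Size([10, 8, 128]) -- 2 * hidden_dim, since bidirectional doubles the feature dim

Note that this skips the last layer, so if you also want dropout on the final output, you can apply an nn.Dropout module to output yourself.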