How to apply dropout if num_layers=1 in a recurrent network?

Hey,
as far as I know, the built-in dropout argument of the RNN module only has an effect if the number of layers is greater than 1, right?
How can I get the same dropout effect if I'm using just one layer?
Is it as simple as doing this:

self.dropout = nn.Dropout(0.5)
...
out, (h, c) = self.rnn(X, (h, c))  # an LSTM returns the hidden state as a (h, c) tuple
out = self.dropout(out)            # dropout applied to the output of every time step
out = self.f_c(out)
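
For context, here is a minimal self-contained sketch of how that could look as a full module; the class name, the layer sizes, and the choice of nn.LSTM are just assumptions for illustration, not from an actual model:

import torch
import torch.nn as nn

class SingleLayerLSTM(nn.Module):
    """Sketch: single-layer LSTM with dropout applied manually on its outputs."""
    def __init__(self, input_size=10, hidden_size=20, output_size=5):  # sizes are made up
        super().__init__()
        # num_layers=1, so the LSTM's own dropout argument would have no effect here
        self.rnn = nn.LSTM(input_size, hidden_size, num_layers=1)
        self.dropout = nn.Dropout(0.5)
        self.f_c = nn.Linear(hidden_size, output_size)

    def forward(self, X, h, c):
        out, (h, c) = self.rnn(X, (h, c))  # out: (seq_len, batch, hidden_size)
        out = self.dropout(out)            # dropout on each time step's output
        out = self.f_c(out)
        return out, (h, c)

model = SingleLayerLSTM()
X = torch.randn(7, 3, 10)     # (seq_len, batch, input_size)
h = torch.zeros(1, 3, 20)     # (num_layers, batch, hidden_size)
c = torch.zeros(1, 3, 20)
out, (h, c) = model(X, h, c)  # out: (seq_len, batch, output_size)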

Does this mean the same dropout mask is applied at each time step? That is, are the same units zeroed out in every time step's output?

Is this a common way to do it?
Thanks for helping.

I tried it out; the mask is independent for each time step, isn't it?
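
For what it's worth, this is roughly how I checked it, just by running nn.Dropout on a tensor of ones (the shapes are made up):

import torch
import torch.nn as nn

drop = nn.Dropout(0.5)
drop.train()                    # dropout is only active in training mode
x = torch.ones(4, 1, 6)         # (seq_len, batch, hidden)
y = drop(x)
print(y)                        # zeros land in different positions at each time step
print(torch.equal(y[0], y[1]))  # almost always False -> the mask is sampled per element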