Turn off dropout in an RNN during training

I am currently freezing an RNN that uses dropout during training. If I freeze the RNN's parameters, will the layer still apply dropout? If so, how do I turn off the dropout as well?

You can turn off the Dropout layer by calling .eval() on the layer (or on the whole model). If you want to freeze the parameters, you would have to call .requires_grad_(False) on them.
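
Here is a minimal sketch (the layer sizes and the use of nn.LSTM are just for illustration) showing that freezing the parameters alone does not disable dropout:

```python
import torch
import torch.nn as nn

# An RNN with built-in dropout between its stacked layers.
rnn = nn.LSTM(input_size=16, hidden_size=32, num_layers=2, dropout=0.5)

# Freezing only stops gradient updates; it does NOT disable dropout.
for param in rnn.parameters():
    param.requires_grad_(False)

rnn.train()                 # dropout is still active in training mode
x = torch.randn(10, 4, 16)  # (seq_len, batch, input_size)
out_train, _ = rnn(x)

rnn.eval()                  # dropout is only disabled in eval mode
out_eval, _ = rnn(x)
```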

Sure. I did that. But what if I want to turn off dropout during training?

Just call eval() on your Dropout layer, not on the whole model.
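
For instance, assuming a model with an explicit nn.Dropout submodule (the architecture below is made up for illustration), you can put just that layer into eval mode while the rest keeps training:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(input_size=16, hidden_size=32)
        self.dropout = nn.Dropout(p=0.5)
        self.fc = nn.Linear(32, 10)

    def forward(self, x):
        out, _ = self.rnn(x)
        return self.fc(self.dropout(out))

model = Net()
model.train()         # the whole model is in training mode ...
model.dropout.eval()  # ... except the Dropout layer, which now acts as a no-op

x = torch.randn(10, 4, 16)
out = model(x)
```

Note that a later call to model.train() puts the Dropout layer back into training mode, so you would have to call model.dropout.eval() again afterwards.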

You can also set the dropout probability to 0.0 (a float value) if you don't want any dropout to be applied during training (i.e., so that it won't zero out any elements of the tensors).

See here: https://github.com/pytorch/examples/blob/master/word_language_model/main.py#L34
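
As a quick sketch of that approach (the sizes are arbitrary), constructing the module with dropout=0.0 leaves the outputs untouched even in training mode:

```python
import torch
import torch.nn as nn

# dropout=0.0 means no elements are ever zeroed out, even in training mode.
rnn = nn.LSTM(input_size=16, hidden_size=32, num_layers=2, dropout=0.0)

rnn.train()
x = torch.randn(10, 4, 16)
out1, _ = rnn(x)
out2, _ = rnn(x)

# Identical outputs across two forward passes: nothing was randomly dropped.
assert torch.equal(out1, out2)
```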
