Placing dropout when calling model.eval()

Hi. In evaluation mode, do we still need to keep the dropout call? This is the code:

import torch
import torch.nn as nn

x = torch.randn(4,4)
drop = nn.Dropout(0.5)
model = nn.Linear(4,6)
model.train()
model(drop(x))

model.eval()
d = model(drop(x))
n = model(x)
print(d == n)

This prints a matrix of all False values. So which one is correct: model(drop(x)) or model(x)? I’ve seen some code on GitHub use model(x), but in that case how does the model scale the output at test time if drop(x) is removed?

Usually you would create a model containing all layers. If you then call model(inputs), the forward pass will be executed (in your case with the linear layer and dropout). If your model is quite simple, you could use nn.Sequential; otherwise you would write a custom nn.Module.
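For example, a minimal sketch of the nn.Sequential option with your layer sizes (the variable names here are just illustrative):

import torch
import torch.nn as nn

# all layers live inside one module, so model.train()/model.eval() affects dropout too
model = nn.Sequential(
    nn.Linear(4, 6),
    nn.Dropout(0.5),
)

x = torch.randn(4, 4)
model.train()
out_train = model(x)   # dropout is active here
model.eval()
out_eval = model(x)    # dropout is a no-op here, since it is a submodule of model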

In your example code, you would have to call drop.eval() to set the dropout layer to evaluation mode, since model does not contain this layer.
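Roughly, applied to your snippet, that would be:

model.eval()
drop.eval()               # needed, since drop is not a submodule of model
d = model(drop(x))        # drop now acts as the identity
n = model(x)
print(torch.equal(d, n))  # True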

Thanks a lot @ptrblck for your answer. If I had them in a class like:

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.layer = nn.Linear(4, 6)
        self.dropout = nn.Dropout(p=0.5)
        self.classification = nn.Linear(6, 3)

    def forward(self, x):
        out = self.dropout(self.layer(x))
        return self.classification(out)

I know that if I call model.eval(), dropout switches to evaluation mode. However, assuming I do not want to follow the forward function and instead call the layers manually, do I need to put model.dropout in between, as in:

model = Model()
model.eval()
x = torch.randn(4,4)
out1 = model.layer(x)
# Do I need to put model.dropout(out1)?
out2 = model.classification(out1)

Thanks!

If you want to call the modules manually, you would have to call model.dropout as well.
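A short sketch of what that would look like, just extending your snippet:

model = Model()
model.eval()
x = torch.randn(4, 4)
out1 = model.layer(x)
out1 = model.dropout(out1)         # identity in eval mode, but keeps the manual path consistent with forward()
out2 = model.classification(out1)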

Actually it doesn’t matter whether you call self.dropout in evaluation mode or not. PyTorch uses inverted dropout, so the scaling is done at training time only; during evaluation nothing needs to be scaled and dropout acts as the identity.
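A small sketch to illustrate the inverted-dropout scaling (an all-ones input is used here only to make the 1/(1-p) factor visible):

import torch
import torch.nn as nn

drop = nn.Dropout(p=0.5)
x = torch.ones(8)

drop.train()
print(drop(x))   # surviving elements are scaled by 1/(1-p) = 2.0, dropped ones are 0

drop.eval()
print(drop(x))   # identity: all ones, nothing needs to be scaled at evaluation time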
