Check that dropout is deactivated

I have the code below. Even after I put the model in eval mode, the dropout layer is still printed when I print the model. Is this expected? How can I tell that dropout is actually deactivated once I switch to evaluation mode?

# DROPOUT CHECK

import torch
import torch.nn as nn

class Test(nn.Module):
    def __init__(self, pool_size=(4, 4)):
        super(Test, self).__init__()
        self.layer = nn.Linear(1, 1, bias=False)
        self.dropout = nn.Dropout()  # p=0.5 by default

    def forward(self, x):
        out = self.dropout(self.layer(x))
        return out

model = Test()
model.train()
print("train model is: ", model)
model.eval()
print("eval model is: ", model)

Yes, the model still contains the dropout layer, which is why it is printed, but calling model.eval() will disable it.
You can check the layer's training attribute, which should then be set to False, via: print(model.dropout.training).
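If you want a behavioral check on top of the flag, here is a minimal sketch (assuming the Test model defined above; the input x is just illustrative): in train mode nn.Dropout randomly zeroes elements and scales the rest by 1/(1-p), while in eval mode it is the identity, so repeated forward passes give identical outputs.

# Behavioral sanity check, assuming the Test model above
x = torch.ones(10, 1)

model.train()
print(model(x).squeeze())  # some elements randomly zeroed, the rest scaled by 1/(1-p)

model.eval()
print(model(x).squeeze())  # deterministic, equal to model.layer(x)
print(torch.equal(model(x), model(x)))  # True, since dropout is a no-op in eval mode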

Thanks a lot. Your answer cleared up my doubt.

Hi, I just made a small change in the code and applied DataParallel as below. It throws the error "AttributeError: 'DataParallel' object has no attribute 'layer_1'".

# DROPOUT CHECK

import torch
import torch.nn as nn

class Test(nn.Module):
    def __init__(self, pool_size=(4, 4)):
        super(Test, self).__init__()
        # dropout is now the second module inside the Sequential (layer_1[1])
        self.layer_1 = nn.Sequential(nn.Linear(1, 1, bias=False),
                                     nn.Dropout())

    def forward(self, x):
        out = self.layer_1(x)  # the Sequential already applies the dropout
        return out

model = Test()

# ADDED EXTRA 2 LINES AS BELOW
model.to(0)
model = torch.nn.DataParallel(model, [0])

model.train()
print("train model is: ", model)
print("training ", model.layer_1[1].training)  # raises the AttributeError

model.eval()
print("testing model ", model.layer_1[1].training)

nn.DataParallel wraps the passed model in its .module attribute, so you would have to access the submodule via model.module.layer_1.
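For example, a minimal sketch of the corrected access (assuming the Test model from your snippet and a single available GPU with id 0):

model = torch.nn.DataParallel(Test().to(0), [0])

model.train()
print(model.module.layer_1[1].training)  # True: dropout is active

model.eval()
print(model.module.layer_1[1].training)  # False: eval() propagates into .module

Calling eval() on the DataParallel wrapper recursively flips the training flag on every submodule it holds, so the inner dropout is disabled just as in the unwrapped case.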

Your prompt reply is really appreciated. Thanks.