I have fine-tuned the pre-trained densenet121 PyTorch model with a dropout rate of 0.2.
Now, is there any way I can use dropout while testing an individual image?
The purpose is to pass a single image through the learned network multiple times (with dropout active), calculate the mean/variance of the outputs, and do further analysis.
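What I have in mind looks roughly like this. This is only a sketch: the small stand-in model, the number of passes (20), and the random tensor in place of a real image are all placeholders for my actual densenet121 setup, and the open question is how to keep dropout active on the fine-tuned net at test time.

```python
import torch

# Hypothetical stand-in for the fine-tuned densenet121.
model = torch.nn.Sequential(
    torch.nn.Linear(16, 64),
    torch.nn.ReLU(),
    torch.nn.Dropout(p=0.2),
    torch.nn.Linear(64, 4),
)
image = torch.randn(1, 16)  # placeholder for a preprocessed image

# A freshly built model is in training mode, so dropout is active here;
# keeping it active on a trained model at test time is exactly the question.
with torch.no_grad():
    outputs = torch.stack([model(image) for _ in range(20)])

mean = outputs.mean(dim=0)      # predictive mean over the stochastic passes
variance = outputs.var(dim=0)   # predictive variance, as an uncertainty proxy
```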
You can set your whole network to .eval() mode, but then set your dropout layers back to .train() mode. You can use the apply function to achieve this.
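For instance, sketched on a toy model (the model here just stands in for your densenet121, and enable_dropout is a name I'm choosing):

```python
import torch
import torch.nn as nn

# Toy stand-in for the fine-tuned network.
model = nn.Sequential(nn.Linear(16, 64), nn.Dropout(p=0.2), nn.Linear(64, 4))

def enable_dropout(m):
    # model.apply calls this on every submodule; flip only the dropout layers.
    if isinstance(m, nn.Dropout):
        m.train()

model.eval()                 # put everything into inference mode...
model.apply(enable_dropout)  # ...then re-enable just the dropout layers

x = torch.randn(1, 16)
with torch.no_grad():
    out1, out2 = model(x), model(x)
# out1 and out2 differ because dropout is still stochastic
```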
Thanks. apply is a very handy function.
Related to this, I have another question.
To achieve the same goal – that is, to use dropout during testing – I thought I could:
- train a net with dropout, and then
- during testing, set the net to .train(mode=True) and get the output for the same input over multiple runs, without updating the network params between runs.
But even when I set the net to .train(), I got the same output across multiple runs with a single image.
Can you explain why my approach did not work?
Can you provide a simple example of how to do it via
apply to enable dropout at test time?
You can do something like this. Suppose model has dropout layers:

    def enable_dropout(m):
        if type(m) == nn.Dropout:
            m.train()

    model.eval()
    model.apply(enable_dropout)

Or more generally:

    model.eval()
    for each_module in model.modules():
        if type(each_module) == nn.Dropout:
            each_module.train()
A typo edit suggestion:
for each_module in model.modules():
apply is the recommended approach, as setting the whole network to
train mode will modify the running statistics of any batch-normalization layers in the network. Thanks for showing how to do it with that method!
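To illustrate the difference on a toy model (a made-up example, not densenet121): a forward pass in whole-model train mode updates the batch-norm running statistics, while eval mode with only dropout re-enabled leaves them untouched.

```python
import torch
import torch.nn as nn

# Toy model containing both batch norm and dropout.
model = nn.Sequential(nn.Linear(16, 8), nn.BatchNorm1d(8), nn.Dropout(p=0.2))
x = torch.randn(32, 16)

# apply route: eval everywhere, then re-enable only the dropout layers.
model.eval()
model.apply(lambda m: m.train() if isinstance(m, nn.Dropout) else None)
before = model[1].running_mean.clone()
with torch.no_grad():
    model(x)
mid = model[1].running_mean.clone()   # unchanged: BatchNorm stayed in eval mode

# train() route: the same forward pass now updates the running statistics.
model.train()
with torch.no_grad():
    model(x)
after = model[1].running_mean.clone()  # changed: BatchNorm was in train mode
```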
Just to add my two cents:
this only works when the dropout layers were registered during initialization as
nn.Dropout modules. If one uses functional dropout,
F.dropout(x, training=self.training), in the
forward() method, as it is in densenet, such toggling will not work. The only way I currently have to turn on dropout during evaluation is to redefine the
forward() method, replacing
F.dropout(x, training=self.training) with
F.dropout(x, training=True).
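A minimal sketch of that workaround (the tiny module here is just a stand-in for densenet; hard-coding training=True is the only change from the usual pattern):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    # Stand-in for a network that uses functional dropout in forward(),
    # the way torchvision's densenet does.
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(16, 64)
        self.fc2 = nn.Linear(64, 4)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        # training=True forces dropout even when the module is in eval mode;
        # the original code would pass training=self.training instead.
        x = F.dropout(x, p=0.2, training=True)
        return self.fc2(x)

model = Net()
model.eval()  # has no effect on the hard-coded F.dropout above
x = torch.randn(1, 16)
with torch.no_grad():
    out1, out2 = model(x), model(x)
# outputs differ run to run: dropout is still active in eval mode
```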