I have fine-tuned the pre-trained densenet121 PyTorch model with a dropout rate of 0.2.
Now, is there any way I can use dropout while testing an individual image?
The purpose is to pass a single image through the learned network multiple times (with dropout active), calculate the mean/variance of the outputs, and do further analysis.
To achieve this goal – using dropout during testing – my idea was to train a net with dropout and then, at test time, set the net to .train(mode=True) and run the same input through it multiple times without updating the network parameters between runs.
But even after setting the net to .train(), I got the same output across multiple runs with a single image. Can you explain why my approach did not work?
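For reference, the procedure I am after can be sketched with a toy model standing in for densenet121 (the layer sizes and the number of passes here are illustrative assumptions, not from my actual setup):

```python
import torch
import torch.nn as nn

# Toy stand-in for the fine-tuned densenet121 (sizes are illustrative)
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),
    nn.Dropout(p=0.2),   # dropout rate from the question
    nn.Linear(32, 3),
)

model.train()            # keep dropout active at "test" time
x = torch.randn(1, 16)   # a single input, reused for every pass

with torch.no_grad():    # no backward pass, so parameters never change
    outputs = torch.stack([model(x) for _ in range(100)])

mean = outputs.mean(dim=0)      # per-class mean over the stochastic passes
variance = outputs.var(dim=0)   # per-class predictive variance
```

With dropout active, the stacked outputs differ from pass to pass, which is what makes the mean/variance analysis possible.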
Using apply is the recommended approach, since setting the whole network to train mode will also modify the running statistics of any batch-normalization layers in the network. Thanks for showing how to do it with that method!
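For anyone landing here later, a minimal sketch of that apply-based approach (the helper name enable_dropout and the toy layers are my own, not from densenet):

```python
import torch
import torch.nn as nn

def enable_dropout(m):
    # Called on every submodule via model.apply(); flips only dropout layers
    if isinstance(m, nn.Dropout):
        m.train()

# Toy model with a batch-norm layer to show why .train() on everything is risky
model = nn.Sequential(
    nn.Linear(8, 8),
    nn.BatchNorm1d(8),
    nn.Dropout(p=0.5),
    nn.Linear(8, 2),
)

model.eval()                 # batch norm now uses its running statistics
model.apply(enable_dropout)  # dropout is active again; batch norm stays in eval
```

After this, repeated forward passes on the same input still differ because of dropout, while the batch-norm running statistics are left untouched.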
It only works when the dropout layers were registered during initialization as nn.Dropout modules. If functional dropout F.dropout(x, training=self.training) is used in the forward() method instead, as it is in densenet, this toggling will not work. The only way I have found to turn on dropout during evaluation is to redefine the forward() method, replacing F.dropout(x, training=self.training) with F.dropout(x, training=True).
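A toy module (not densenet itself) illustrating that redefinition; forcing training=True keeps the functional dropout stochastic even in eval mode:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(16, 32)
        self.fc2 = nn.Linear(32, 3)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        # training=True (instead of training=self.training) keeps dropout
        # active regardless of net.eval() / net.train()
        x = F.dropout(x, p=0.2, training=True)
        return self.fc2(x)

net = Net().eval()       # eval mode, yet the outputs below still vary
x = torch.randn(1, 16)
with torch.no_grad():
    outputs = torch.stack([net(x) for _ in range(20)])
```

The same edit applied to densenet's forward() gives stochastic outputs for the mean/variance analysis without touching any other layer's mode.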