How can I add dropout layers after every convolution layer in a pretrained DenseNet201 while keeping the values of its parameters (weights/biases)? (FYI, I want to add dropout layers between the convolutional layers in order to quantify MC-Dropout uncertainty during prediction.)
You could try to replace each original conv layer with a new
nn.Sequential containing the new dropout layer as well as the pre-trained conv layer.
Something like this should work:

```python
model = models.densenet201()

# replace one conv layer (here the stem conv) with Dropout + the original conv
old_conv = model.features.conv0
model.features.conv0 = nn.Sequential(
    nn.Dropout(),
    old_conv,
)
```
Note that you are changing the model architecture and thus won’t be able to load a pre-trained
state_dict into this manipulated model anymore, so make sure to load the
state_dict before adding the dropout layers, or store the new state_dict after the modification and load it into an identically modified model.
Shouldn’t dropout be added to the dense layer as well? Would it be better to put dropout before or after Conv2d?
Could you also show me how to load the state_dict before adding dropout layers?
You can add the dropout layers wherever you think they would work; my code snippet is just one example of how to add dropout before a single conv layer.
If you want to use the pretrained model from torchvision, load it via:

```python
model = models.densenet201(pretrained=True)
```
Otherwise, if you want to load a
state_dict that you’ve stored after training the model, load it via:

```python
model = models.densenet201()
model.load_state_dict(torch.load(path_to_state_dict))
```

In both cases, do this before changing the architecture.
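One more thing for the MC-Dropout use case mentioned in the question: calling model.eval() also disables dropout. A common workaround is to put only the dropout modules back into train mode and average several stochastic forward passes. A sketch with a made-up helper name and a toy model, not code from this thread:

```python
import torch
import torch.nn as nn

def enable_mc_dropout(model: nn.Module) -> None:
    # eval() freezes BatchNorm running stats and disables dropout;
    # re-enable just the dropout modules so predictions stay stochastic.
    model.eval()
    for m in model.modules():
        if isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            m.train()

torch.manual_seed(0)
# toy model for illustration only
model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Dropout(p=0.5), nn.Linear(16, 2))
enable_mc_dropout(model)

x = torch.randn(1, 4)
with torch.no_grad():
    preds = torch.stack([model(x) for _ in range(30)])  # 30 stochastic passes

mean = preds.mean(dim=0)         # predictive mean
uncertainty = preds.std(dim=0)   # spread across passes as the MC-Dropout uncertainty
```

Each forward pass samples a different dropout mask, so the standard deviation across passes gives a per-output uncertainty estimate while the rest of the network (e.g. BatchNorm) behaves as in evaluation mode.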