How to initialize weights for a new layer when loading a pretrained model

The clean way would be to derive a custom class from the desired base class, add the dropout layer in its `__init__`, and override the `forward` method.
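
A minimal sketch of that approach, assuming torchvision >= 0.13 and ResNet-18 as the base model (the class name `ResNet18WithDropout` and the placement of the dropout before `fc` are illustrative choices, not the only option):

```python
import torch
import torch.nn as nn
from torchvision import models
from torchvision.models.resnet import BasicBlock


class ResNet18WithDropout(models.ResNet):
    """ResNet-18 with an extra dropout layer before the final classifier."""

    def __init__(self, p=0.5, **kwargs):
        # BasicBlock and [2, 2, 2, 2] are the standard resnet18 arguments
        super().__init__(BasicBlock, [2, 2, 2, 2], **kwargs)
        # new layer added in __init__; dropout has no weights, so there is
        # nothing to initialize for it
        self.dropout = nn.Dropout(p=p)

    def forward(self, x):
        # replicate ResNet's forward pass, inserting dropout before fc
        x = self.conv1(x)
        x = self.bn1(x)
        x = self.relu(x)
        x = self.maxpool(x)
        x = self.layer1(x)
        x = self.layer2(x)
        x = self.layer3(x)
        x = self.layer4(x)
        x = self.avgpool(x)
        x = torch.flatten(x, 1)
        x = self.dropout(x)
        return self.fc(x)


model = ResNet18WithDropout(p=0.5)
# copy the pretrained weights into the custom model; strict=False would
# also tolerate new layers that do carry parameters of their own
state_dict = models.resnet18(
    weights=models.ResNet18_Weights.IMAGENET1K_V1
).state_dict()
model.load_state_dict(state_dict, strict=False)
```

For a new layer that does have weights (e.g. an extra `nn.Linear`), the same pattern applies: its parameters are initialized by its own constructor, and `strict=False` lets `load_state_dict` skip the keys missing from the pretrained checkpoint.
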
The hacky way would be the approach described in this post.