Control loading of pre-trained weights into a model with similar architecture

Hi,

I want to know whether PyTorch allows us to customise/control how weights from a pre-trained model are loaded into a new model.

I am implementing a Bayesian ResNet CNN model in PyTorch (similar to the one here - https://github.com/kumar-shridhar/PyTorch-BayesianCNN). In this model, instead of having one parameter per weight (the value of the weight), we have two parameters (the mean and standard deviation of the normal distribution defining the weight). Since the rest of the architecture is basically the same, I would like to pre-populate the lower layers using weights from a pre-trained frequentist model (setting the mean equal to the pre-trained weight and the standard deviation to a small non-zero value).
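For context, something along these lines is what I have in mind. It is only a rough sketch: I am assuming the Bayesian layers expose `W_mu` / `W_rho` (and `bias_mu` / `bias_rho`) parameters with sigma given by a softplus of rho, roughly as in the repo above, and that the conv/linear modules appear in the same order in both models. The attribute names and the `BayesianResNet` class are my placeholders, not the repo's actual API.

```python
import math
import torch
import torch.nn as nn
import torchvision.models as models

def init_from_frequentist(bayes_model, freq_model, init_sigma=0.01):
    """Copy pre-trained conv/linear weights into the mean parameters of the
    matching Bayesian layers, and set rho so the initial sigma is small."""
    # rho value giving sigma = init_sigma under sigma = log(1 + exp(rho))
    init_rho = math.log(math.expm1(init_sigma))

    freq_layers = [m for m in freq_model.modules()
                   if isinstance(m, (nn.Conv2d, nn.Linear))]
    # Assumed attribute name; adjust to the actual Bayesian layer implementation.
    bayes_layers = [m for m in bayes_model.modules() if hasattr(m, 'W_mu')]

    # Assumes both models enumerate their conv/linear layers in the same order.
    for f, b in zip(freq_layers, bayes_layers):
        with torch.no_grad():
            b.W_mu.copy_(f.weight)      # mean <- pre-trained weight
            b.W_rho.fill_(init_rho)     # small initial standard deviation
            if f.bias is not None and hasattr(b, 'bias_mu'):
                b.bias_mu.copy_(f.bias)
                b.bias_rho.fill_(init_rho)

# Usage sketch (BayesianResNet is a placeholder for my model class):
# freq = models.resnet18(pretrained=True)
# bayes = BayesianResNet()
# init_from_frequentist(bayes, freq)
```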

Is something like this possible in PyTorch? If so, could you point me to an example or a resource that would help me implement it? Thanks.