Is it possible to lookup a module by parameter name for selective pre-trained parameter loading?

I have a network that I built which has the same structure and parameter names as alexnet, except that I have my own custom layers for some layers. And I want to load pre-trained alexnet parameters for only SOME layers. The main issue is that I want to choose which pretrained parameters to load based on what class the layer is, but I can’t figure out a way to cross-lookup the layer module with the parameter name.

I want to be able to do something like this:

Option 1

pretrained_state_dict = torchvision.models.alexnet(pretrained=True).state_dict()
my_model_state_dict = mymodel.state_dict()

for param_name, pt_param in pretrained_state_dict.items():
    # `lookup_by_param_name_somehow` is the missing piece
    if type(mymodel.modules().lookup_by_param_name_somehow(param_name)) == mycustomclass.Conv2d:
        my_model_state_dict[param_name].copy_(pt_param)

Option 2

Alternatively, the converse would also work, if it is somehow possible:

for layer in mymodel.modules():
    if type(layer) == mycustomclass.Conv2d:
        # layer.state_dict().keys() only gives "odict_keys(['weight', 'bias'])",
        # not the full parameter names
        param_names = layer.get_param_names_somehow()
        for param_name in param_names:
            my_model_state_dict[param_name].copy_(pretrained_state_dict[param_name])

The state_dict keys are just strings mapped to parameter tensors, so they have no link back to the actual layer class, like so:

features.0.weight   torch.Size([64, 3, 11, 11])
features.0.bias   torch.Size([64])
features.3.weight   torch.Size([192, 64, 5, 5])

And the .modules() list contains all the layer modules, which is good, but you can’t get the full parameter names from it:

Conv2d(3, 64, kernel_size=(11, 11), stride=(4, 4), padding=(2, 2))  type:  <class 'mycustomclass.Conv2d'>
ReLU(inplace)  type:  <class 'torch.nn.modules.activation.ReLU'>
MaxPool2d(kernel_size=3, stride=2, padding=0, dilation=1, ceil_mode=False)  type:  <class 'torch.nn.modules.pooling.MaxPool2d'>

Am I just missing some function that can do this, or is it trickier than that?



Have you tried model.named_parameters()? It returns the name and value of each parameter used in the model.

Yes, I’ve tried that already.
model.named_parameters() doesn’t give anything different from model.state_dict():
I still get tuples like ("features.0.weight", tensor).
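To illustrate the point (a minimal toy model, not the actual network from the question): named_parameters() only yields (name, tensor) pairs, with nothing that identifies the owning module's class.

```python
import torch.nn as nn

# Toy model: the names come out as "0.weight", "0.bias", etc. --
# plain strings, with no reference to which module class owns them
model = nn.Sequential(nn.Conv2d(3, 64, kernel_size=11), nn.ReLU())

for name, param in model.named_parameters():
    print(name, param.shape)  # e.g. "0.weight torch.Size([64, 3, 11, 11])"
```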

I need the actual layer type in order to check what class it is from.

What about named_children? It returns the layer name together with the module itself.

I found the named_modules function, which provides exactly what I’m looking for. Thanks everyone!


this is good too: How to access to a layer by module name? - #6 by klory

Can you show a snippet of how it solved your problem? I’m trying to train a pretrained GoogLeNet from torchvision, but I can’t figure out how to access the parameters to pass to an optimizer.
named_modules() gives access to the modules, but not their parameters.
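If it helps, one way to bridge that gap: each module returned by named_modules() still has its own .parameters(), and with recurse=False you get only that module's direct parameters, which can then be collected for an optimizer. A hedged sketch with a toy model (not GoogLeNet, to keep it self-contained):

```python
import torch
import torch.nn as nn

# Toy model standing in for a pretrained network
model = nn.Sequential(nn.Conv2d(3, 8, kernel_size=3),
                      nn.ReLU(),
                      nn.Linear(8, 2))

# Collect only the Conv2d parameters via named_modules()
conv_params = [p for _, m in model.named_modules()
               if isinstance(m, nn.Conv2d)
               for p in m.parameters(recurse=False)]

# Hand just those parameters to the optimizer
optimizer = torch.optim.SGD(conv_params, lr=0.01)
```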