Which layer(s) are being trained with new weights during this transfer learning?

Hello!
I am using the pretrained SqueezeNet 1.1 model to train on a custom dataset with 4 classes. I used the following commands to instantiate the model:

import torch.nn as nn
from torchvision import models

MODEL = models.squeezenet1_1(pretrained=True)
MODEL.classifier[1] = nn.Conv2d(512, self.num_classes, kernel_size=(1, 1), stride=(1, 1))  # self.num_classes is 4 for my dataset

Then I write my training script, which is fairly standard boilerplate, roughly sketched below. I do not freeze any layers.
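For context, a minimal sketch of what I mean by a boilerplate script; the random tensors, loss, and optimizer settings below are placeholders for my actual setup, not the real code:

import torch
import torch.nn as nn
import torch.optim as optim
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models

num_classes = 4
model = models.squeezenet1_1(pretrained=True)
model.classifier[1] = nn.Conv2d(512, num_classes, kernel_size=(1, 1), stride=(1, 1))
model.num_classes = num_classes  # keep the attribute consistent with the new head

# random tensors standing in for the real 4-class dataset
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, num_classes, (8,))
train_loader = DataLoader(TensorDataset(images, labels), batch_size=4)

criterion = nn.CrossEntropyLoss()
# every parameter of the net is passed to the optimizer, nothing is frozen
optimizer = optim.SGD(model.parameters(), lr=0.001, momentum=0.9)

model.train()
for epoch in range(2):
    for x, y in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()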

So, is only the new Conv2d layer getting trained with new weights? Do the MaxPool2d layers in SqueezeNet keep their old weights? Is there a command we can use to get the weights of the entire net before and after the training process?

Since you didn’t freeze any parameters, all of them should be trained as long as you pass them to the optimizer.
Pooling layers do not have any parameters.
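You can verify this directly; a quick check (the layer settings here are just for illustration):

import torch.nn as nn
from torchvision import models

# a pooling layer has no learnable parameters at all
pool = nn.MaxPool2d(kernel_size=3, stride=2)
print(list(pool.parameters()))  # -> []

# only the conv layers of SqueezeNet hold weights; list the modules that do
model = models.squeezenet1_1(pretrained=True)
for name, module in model.named_modules():
    if list(module.parameters(recurse=False)):
        print(name, type(module).__name__)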

You can check the updated parameters by creating a deepcopy of the state_dict before training and comparing it to the state_dict after training:

import copy

# snapshot of all weights before training
state_dict_reference = copy.deepcopy(model.state_dict())

# training
# ...

# compare: a non-zero max abs difference means the parameter was updated
state_dict = model.state_dict()
for key in state_dict:
    print(key, (state_dict_reference[key] - state_dict[key]).abs().max())
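
If you prefer a simple yes/no answer per parameter instead of the magnitude of the change, the same comparison can be done with torch.equal, reusing state_dict and state_dict_reference from the snippet above:

import torch

# flag parameters whose values are identical before and after training
for key in state_dict:
    changed = not torch.equal(state_dict_reference[key], state_dict[key])
    print(key, "updated" if changed else "NOT updated")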