How can I update the parameters in certain layers?


Suppose I have a pretrained model with several convolutional layers and a single fc layer.

How can I update the parameters only in the convolutional layers while leaving the parameters in the fc layer untouched?

Thank you.

You can solve this by reading the docs for the optimizer constructors. The relevant part is:

    params (iterable): an iterable of :class:`Variable` s or
        :class:`dict` s. Specifies what Variables should be optimized.

You just need to pass only the params you want to update.
Here is an example for you.

    optimizer = torch.optim.Adam([{'params': model.conv.parameters()}])
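
Here is a minimal runnable sketch of the idea, assuming a hypothetical model with a `conv` layer and an `fc` layer (the layer names and shapes are made up for illustration). Only the conv parameters are handed to the optimizer, so an `optimizer.step()` never touches the fc weights:

```python
import torch
import torch.nn as nn

# Hypothetical model: one conv layer and one fc layer.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 4, kernel_size=3, padding=1)
        self.fc = nn.Linear(4 * 8 * 8, 10)

    def forward(self, x):
        x = torch.relu(self.conv(x))
        return self.fc(x.flatten(1))

model = Net()

# Only the conv parameters are given to the optimizer.
optimizer = torch.optim.Adam([{'params': model.conv.parameters()}], lr=1e-3)

# Optionally, also freeze fc so no gradients are stored for it at all.
for p in model.fc.parameters():
    p.requires_grad = False

fc_before = model.fc.weight.clone()
conv_before = model.conv.weight.clone()

out = model(torch.randn(2, 1, 8, 8))
out.sum().backward()
optimizer.step()

# The fc weights are unchanged; the conv weights were updated.
print(torch.equal(model.fc.weight, fc_before))
print(torch.equal(model.conv.weight, conv_before))
```

Note that setting `requires_grad = False` on the fc parameters is not strictly required here (the optimizer already ignores them), but it saves the gradient computation for those tensors. Gradients still flow *through* the fc layer back to the conv layer either way.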