How can I have two optimizers, one that trains all of the parameters and another that trains only a subset of the parameters?

Hi! I am new to PyTorch.
I have a model:

import torch
import torch.nn as nn
import torch.nn.functional as F

class MyModel(nn.Module):
    def __init__(self):
        super(MyModel, self).__init__()
        self.conv1 = nn.Conv2d(128, 128, (3, 3))
        self.conv2 = nn.Conv2d(128, 256, (3, 3))
        self.conv3 = nn.Conv2d(256, 256, (3, 3))

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = F.relu(self.conv2(x))
        x = F.relu(self.conv3(x))
        return x

model = MyModel()

I want to train the model in such a way that in every training step DATA_X1 trains all the layers
['conv1', 'conv2', 'conv3'] and DATA_X2 trains only the ['conv3'] layer.
Is there a way I can do this? Any help would be appreciated.

Hello,

The following topic (Two optimizers for one model) discusses what you can do with two optimizers. You only have to pass the correct parameters to each optimizer. For example:

# optim1 updates every parameter in the model
optim1 = torch.optim.SGD(model.parameters(), lr=0.001)
# optim2 updates only the parameters of conv3
optim2 = torch.optim.Adam(model.conv3.parameters(), lr=0.05)
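
A training step could then look like the following sketch. DATA_X1, DATA_X2, target1, and target2 are hypothetical batches and targets here, and nn.MSELoss is just a placeholder criterion; substitute your own data and loss:

loss_fn = nn.MSELoss()  # placeholder loss

# Step on DATA_X1: optim1 updates conv1, conv2, and conv3.
optim1.zero_grad()
loss1 = loss_fn(model(DATA_X1), target1)
loss1.backward()
optim1.step()

# Step on DATA_X2: backward still computes gradients for all
# layers, but optim2.step() only updates conv3's parameters.
optim2.zero_grad()
loss2 = loss_fn(model(DATA_X2), target2)
loss2.backward()
optim2.step()

Note that loss2.backward() also fills the gradients of conv1 and conv2; those stale gradients are cleared by optim1.zero_grad() at the start of the next step, or you can call model.zero_grad() instead to clear everything at once.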

Good luck!