Freezing all the layers except specific layers

I have a model that I trained on a dataset. I then changed its last two layers and now want to fine-tune it. In other words, I want to freeze all layers of the model except those two (self.conv_6 and self.sigmoid) and train the model on the dataset again.

This is my model:

class model(nn.Module):
    def __init__(self, pretrained=False):
        super(model, self).__init__()

        self.conv_1 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_2 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_3 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_4 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_5 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_6 = nn.Conv3d(1024, 1024, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))

        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        x = self.conv_1(x)
        x = self.conv_2(x)
        x = self.conv_3(x)
        x = self.conv_4(x)
        x = self.conv_5(x)
        x = self.conv_6(x)
        y = self.sigmoid(x)
        return y

How can I freeze all the layers except those two?

Yes, you can freeze all layers via:

for param in model.parameters():
    param.requires_grad = False

and then unfreeze the trainable layer:

for param in model.conv_6.parameters():
    param.requires_grad = True

nn.Sigmoid does not contain any trainable parameters, so there is nothing in it to freeze or train.
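Put together, the pattern looks like this. This is a minimal sketch: it uses a stand-in model with much smaller Conv3d layers than the one above so it runs quickly, but the freezing logic is identical.

```python
import torch.nn as nn

# Stand-in for the model above, with smaller channel counts (assumption:
# only conv_1 and conv_6 are shown; the logic extends to all six convs).
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv_1 = nn.Conv3d(8, 8, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.conv_6 = nn.Conv3d(8, 8, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.sigmoid = nn.Sigmoid()

model = Model()

# Freeze everything first...
for param in model.parameters():
    param.requires_grad = False

# ...then unfreeze only conv_6.
for param in model.conv_6.parameters():
    param.requires_grad = True

# Verify: only conv_6's weight and bias remain trainable.
trainable = [name for name, p in model.named_parameters() if p.requires_grad]
print(trainable)  # ['conv_6.weight', 'conv_6.bias']
```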

@ptrblck, thank you for your answer. This raises one question for me: where should I put these lines of code, inside the model or in the training script? And if they belong in the training script, in which part should I use them?

You can freeze the parameters right after creating the model and before training starts.
If needed, you can then freeze or unfreeze additional parameters during training, e.g. after each training iteration.
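In a training script, the placement could look like this. This is a hedged sketch, not the original poster's code: the tiny `Model`, the dummy batch, the loss, and the Adam optimizer are all illustrative assumptions. The key point is that freezing happens after model creation and before the optimizer and training loop; passing only the trainable parameters to the optimizer is optional but keeps it tidy.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model standing in for the real one.
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv_6 = nn.Conv3d(4, 4, kernel_size=(3,1,1), stride=(2,1,1), padding=(1,0,0))
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        return self.sigmoid(self.conv_6(x))

# 1. Create the model.
model = Model()

# 2. Freeze/unfreeze right here, before building the optimizer.
for param in model.parameters():
    param.requires_grad = False
for param in model.conv_6.parameters():
    param.requires_grad = True

# 3. Hand only the trainable parameters to the optimizer.
optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-3
)

# 4. Train as usual; only conv_6 receives updates.
x = torch.randn(2, 4, 8, 1, 1)       # dummy input batch
target = torch.rand(2, 4, 4, 1, 1)   # depth halves: stride 2 maps 8 -> 4
loss = nn.functional.binary_cross_entropy(model(x), target)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```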