Hi all.
I want to stop back-propagation in some layers.
For example, in the network below I want back-propagation to run only through model_2 and not through model_1.
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.model_1 = nn.Sequential(
            nn.Conv2d(in_channels=3, out_channels=32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),
        )
        self.model_2 = nn.Sequential(
            nn.Conv2d(in_channels=32, out_channels=48, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),
        )

    def forward(self, x):
        x = self.model_1(x)
        x = self.model_2(x)
        # output_size is defined elsewhere in my code (48 * pooled height * pooled width)
        x = x.view(-1, output_size)
        return x
I have searched about this and I have an idea: separate Model() into Model1() and Model2(), and then freeze Model1 with requires_grad.
Is this the right way to do what I want?
model1 = Model1()
model2 = Model2()

for param in model1.parameters():
    param.requires_grad = False

optimizer = optim.SGD(model2.parameters(), lr=0.01, momentum=0.9)
and no optimizer for model1
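To make sure I'm describing it clearly, here is a fuller sketch of what I mean. The Model1/Model2 split, the dummy input size, and the dummy loss are just my assumptions based on the network above:

import torch
import torch.nn as nn
import torch.optim as optim

class Model1(nn.Module):
    # first block, the one I want to freeze
    def __init__(self):
        super(Model1, self).__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels=3, out_channels=32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),
        )

    def forward(self, x):
        return self.features(x)

class Model2(nn.Module):
    # second block, the one I want to train
    def __init__(self):
        super(Model2, self).__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels=32, out_channels=48, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool2d(kernel_size=2),
        )

    def forward(self, x):
        x = self.features(x)
        return x.view(x.size(0), -1)

model1 = Model1()
model2 = Model2()

# freeze model1: no gradients are computed for its parameters
for param in model1.parameters():
    param.requires_grad = False

# the optimizer only sees model2's parameters
optimizer = optim.SGD(model2.parameters(), lr=0.01, momentum=0.9)

# one dummy training step to check which parameters get gradients
x = torch.randn(4, 3, 32, 32)   # assumed input size, just for the check
out = model2(model1(x))
loss = out.sum()                # dummy loss, just for the check
optimizer.zero_grad()
loss.backward()
optimizer.step()

print(all(p.grad is None for p in model1.parameters()))      # True: model1 untouched
print(all(p.grad is not None for p in model2.parameters()))  # True: model2 gets gradients

From what I have read, requires_grad = False stops gradient computation for model1's parameters, and leaving model1 out of the optimizer means they would not be updated anyway, so I think either one alone might already be enough. Is this understanding correct, or is there a better way to do it without splitting the class?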