Suppose I have a network whose forward pass is split into two parts:
def forward(self, x):
    x = self.part_1(x)
    return self.part_2(x)
If I want to update only the second part during training, will this modification do what I want?
def forward(self, x):
    with torch.no_grad():
        x = self.part_1(x)
    return self.part_2(x)
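For context, here is a minimal runnable sketch of what I mean (the class name and layer shapes are made up for illustration); after `backward()`, I would expect `part_1` to receive no gradients while `part_2` does:

```python
import torch
import torch.nn as nn

class TwoPartNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.part_1 = nn.Linear(4, 4)
        self.part_2 = nn.Linear(4, 1)

    def forward(self, x):
        # No autograd graph is recorded for part_1's computation.
        with torch.no_grad():
            x = self.part_1(x)
        # Gradients can only flow back as far as part_2's parameters.
        return self.part_2(x)

net = TwoPartNet()
loss = net(torch.randn(8, 4)).sum()
loss.backward()

print(net.part_1.weight.grad)              # None: part_1 was frozen
print(net.part_2.weight.grad is not None)  # True: part_2 got gradients
```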