I have a complex graph model and I only want to update some of its layers. I know I can use something like “conv.requires_grad = False” so that gradients for that layer’s parameters won’t be accumulated, but what if I want to update layers that come before this layer? Will requires_grad affect them?
Here’s an example:
import torch.nn as nn

class model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 3, 1, 1, 0)
        self.conv2 = nn.Conv2d(3, 3, 1, 1, 0)
        # requires_grad_(False) freezes conv2's weight and bias;
        # note that plain `self.conv2.requires_grad = False` only sets an
        # attribute on the module and does not touch its parameters
        self.conv2.requires_grad_(False)
If I set conv2’s requires_grad to False, can I still update conv1? Thanks.
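To make the question concrete, here is a minimal runnable sketch of what I mean. The forward pass and the dummy input are my additions, just for illustration; I check whether a .grad is populated on each conv after one backward pass:

```python
import torch
import torch.nn as nn

class model(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 3, 1, 1, 0)
        self.conv2 = nn.Conv2d(3, 3, 1, 1, 0)
        self.conv2.requires_grad_(False)  # freeze conv2's weight and bias

    def forward(self, x):
        # assumed forward: conv1 feeds into the frozen conv2
        return self.conv2(self.conv1(x))

m = model()
out = m(torch.randn(1, 3, 4, 4))  # dummy input
out.sum().backward()

# Is conv1's gradient computed even though conv2 is frozen?
print("conv1 grad populated:", m.conv1.weight.grad is not None)
print("conv2 grad populated:", m.conv2.weight.grad is not None)
```

In my runs, conv1 does receive a gradient (autograd still backpropagates *through* the frozen conv2, it just doesn’t accumulate gradients *for* conv2’s parameters) — I’d just like to confirm this is guaranteed behavior.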