I have a model like this:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.layer_a = nn.Linear(1, 1, bias=False)
        self.layer_b = nn.Linear(1, 1, bias=False)

    def forward(self, x):
        x_a = self.layer_a(x)
        x_b = self.layer_b(x)
        return (x_a, x_b)
```
There are two phases of training:

- Phase 1: `self.layer_a` and `self.layer_b` share the same weight.
- Phase 2: `self.layer_a` is frozen; only `self.layer_b` keeps updating.
To achieve Phase 1, I do something like:

```python
model = Net()
model.layer_b.weight = model.layer_a.weight
```

Then both of them share the same weight.
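As far as I can tell, after this assignment both names point at the same `Parameter` object, so any optimizer step moves both. A quick check:

```python
print(model.layer_b.weight is model.layer_a.weight)  # True
```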
But the problem is: how do I freeze `model.layer_a.weight` without affecting `model.layer_b.weight` in Phase 2?
Or is there a better way to achieve this?
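One idea I had is to break the sharing at the start of Phase 2: give `layer_b` its own copy of the weight, then turn off gradients for `layer_a`. A rough sketch of what I mean (untested; the clone-then-freeze approach is just my guess):

```python
# Phase 2 setup: un-share the weights, then freeze layer_a.
with torch.no_grad():
    # Give layer_b an independent Parameter copied from the shared weight.
    model.layer_b.weight = nn.Parameter(model.layer_a.weight.clone())

# Exclude layer_a from future gradient updates.
model.layer_a.weight.requires_grad = False
```

I suspect I would also need to rebuild the optimizer (or filter its parameters on `requires_grad`) so `layer_a.weight` is no longer updated, but I'm not sure this is the cleanest way.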