How can I assign per-forward-pass learning rates to modules in a sequential model?
I know that in the normal case we can use optimizer parameter groups, e.g. `{"params": model.a.parameters(), "lr": lr / 100}`, but I want to use different learning rates for `loss1` and `loss2`, which went through the same modules.
```python
import torch.nn as nn
import torch.nn.functional as F

class X(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Conv2d(1, 1, 1)
        self.b = nn.Conv2d(1, 1, 1)

    def forward(self, input):
        return self.b(self.a(input))

model = X()
output1 = model(input1)
output2 = model(input2)
loss1 = F.mse_loss(output1, label1)
loss2 = F.mse_loss(output2, label2)
```
For `loss1`, I want both `a`'s and `b`'s parameters to step with `lr`, but for `loss2`, `a` should step with `lr` and `b` with `-lr`:
| loss | a | b |
|---|---|---|
| loss1 | lr | lr |
| loss2 | lr | -lr |
Because `loss1` and `loss2` both backpropagate through both the `a` and `b` modules, we can't achieve this just by calling `(-loss1).backward()` or the like.
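One approach that might work here (a sketch I'd try, not something from the original setup; the dummy inputs, labels, and the `lr` value are placeholders): keep each loss's gradients separate with `torch.autograd.grad` instead of calling `.backward()`, then apply them manually with a per-module, per-loss step size.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class X(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Conv2d(1, 1, 1)
        self.b = nn.Conv2d(1, 1, 1)

    def forward(self, input):
        return self.b(self.a(input))

model = X()
lr = 0.1  # placeholder step size

# Dummy data just for the sketch
input1 = torch.randn(2, 1, 4, 4)
input2 = torch.randn(2, 1, 4, 4)
label1 = torch.randn(2, 1, 4, 4)
label2 = torch.randn(2, 1, 4, 4)

loss1 = F.mse_loss(model(input1), label1)
loss2 = F.mse_loss(model(input2), label2)

a_params = list(model.a.parameters())
b_params = list(model.b.parameters())

# autograd.grad returns gradients without accumulating into .grad,
# so the two losses' gradients stay separate
grads1 = torch.autograd.grad(loss1, a_params + b_params)
grads2 = torch.autograd.grad(loss2, a_params + b_params)

with torch.no_grad():
    n_a = len(a_params)
    # loss1: step both a and b with lr (descent)
    for p, g in zip(a_params + b_params, grads1):
        p -= lr * g
    # loss2: lr for a (descent), -lr for b (ascent)
    for p, g in zip(a_params, grads2[:n_a]):
        p -= lr * g
    for p, g in zip(b_params, grads2[n_a:]):
        p += lr * g
```

This bypasses the optimizer entirely, so it is plain SGD; momentum or Adam-style state would have to be handled separately.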
Can I assign some option when the model forwards?

Thanks!
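One way to get an option at forward time (a sketch of the gradient-reversal trick, not a confirmed answer; the `flip_b` flag is a name I made up): insert a custom autograd function that is the identity in forward but negates gradients in backward. Placing one reversal between `a` and `b` and another after `b` means `b`'s own gradients are flipped once (net `-lr` under a normal optimizer step), while gradients reaching `a` are flipped twice (net `+lr`).

```python
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    """Identity in the forward pass; negates gradients in the backward pass."""
    @staticmethod
    def forward(ctx, x):
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -grad_output

class X(nn.Module):
    def __init__(self):
        super().__init__()
        self.a = nn.Conv2d(1, 1, 1)
        self.b = nn.Conv2d(1, 1, 1)

    def forward(self, input, flip_b=False):  # flip_b is a hypothetical flag
        h = self.a(input)
        if flip_b:
            # inner reversal: will flip gradients flowing back into a
            h = GradReverse.apply(h)
        out = self.b(h)
        if flip_b:
            # outer reversal: flips everything once more, so b's grads are
            # flipped once (negated) and a's grads twice (unchanged)
            out = GradReverse.apply(out)
        return out
```

With this, `loss1` uses `model(input1)`, `loss2` uses `model(input2, flip_b=True)`, and a single optimizer with learning rate `lr` over all parameters realizes the table above, since `loss2`'s gradient on `b` arrives already negated.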