Is there a way to split model parameters and use a separate optimizer for each parameter set?
I currently have a model like:
```python
class Net(nn.Module):
    def __init__(self, arg2, arg3, arg4, arg1=1):
        super(Net, self).__init__()
        self.customout = CustomLayer(arg1, arg2)
        self.hidden = torch.nn.Linear(arg2, arg3)
        self.predict = torch.nn.Linear(arg3, arg4)

    def forward(self, x):
        out = ...
        return out
```
I would like to split the model parameters into two sets:

```
param1 = {customout.weight, customout.bias}
param2 = {hidden.weight, hidden.bias, predict.weight, predict.bias}
```

so that I can maximize the objective over param1 and minimize it over param2.
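Concretely, the alternating update I have in mind looks roughly like this (a toy, self-contained sketch: `p1`, `p2`, and the quadratic objective are stand-ins for my actual parameter sets and loss):

```python
import torch
import torch.nn as nn

# Toy stand-ins: p1 plays the role of the custom-layer parameters
# (to be maximized over), p2 the Linear parameters (to be minimized over).
p1 = nn.Parameter(torch.randn(3))
p2 = nn.Parameter(torch.randn(3))
optimizer1 = torch.optim.Adam([p1], lr=1e-2)  # ascent on p1
optimizer2 = torch.optim.Adam([p2], lr=1e-2)  # descent on p2

for step in range(5):
    # Ascent step for p1: negate the loss so Adam's descent maximizes it.
    loss = ((p1 - p2) ** 2).sum()
    optimizer1.zero_grad()
    (-loss).backward()
    optimizer1.step()

    # Descent step for p2: recompute the loss, since the previous
    # backward() freed the graph and p1 has changed in place.
    loss = ((p1 - p2) ** 2).sum()
    optimizer2.zero_grad()  # also clears the negated grads left on p2
    loss.backward()
    optimizer2.step()
```

Note that each `backward()` writes gradients into both parameter sets, so each optimizer's `zero_grad()` must run before its own backward pass.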
I tried something like this:
```python
param1 = set()
param2 = set()
for name, m in net.named_parameters():
    if name.lower().startswith('custom'):
        param1 |= set(m)
    else:
        param2 |= set(m)

optimizer1 = torch.optim.Adam(param1, lr=learning_rate)
optimizer2 = torch.optim.Adam(param2, lr=learning_rate)
```
But this results in the following error:

```
raise ValueError("can't optimize a non-leaf Tensor")
ValueError: can't optimize a non-leaf Tensor
```
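For reference, here is a self-contained snippet that reproduces the error; `CustomLayer` is not shown above, so a plain `nn.Linear` is substituted as a hypothetical stand-in (only the attribute name matters for the split):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, arg2=4, arg3=8, arg4=2, arg1=1):
        super(Net, self).__init__()
        # Hypothetical stand-in: CustomLayer's definition isn't shown,
        # so a plain Linear is used in its place.
        self.customout = nn.Linear(arg1, arg2)
        self.hidden = nn.Linear(arg2, arg3)
        self.predict = nn.Linear(arg3, arg4)

    def forward(self, x):
        return self.predict(torch.relu(self.hidden(self.customout(x))))

net = Net()
param1, param2 = set(), set()
for name, m in net.named_parameters():
    if name.lower().startswith('custom'):
        param1 |= set(m)  # set(m) iterates the tensor itself: yields non-leaf views
    else:
        param2 |= set(m)

try:
    torch.optim.Adam(param1, lr=1e-3)
except ValueError as e:
    print(e)  # prints: can't optimize a non-leaf Tensor
```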
Please let me know how this can be addressed.
Thank you!