Split parameters from a model

Is there a way to split model parameters and use a separate optimizer for each parameter set?

I currently have a model like this:

class Net(nn.Module):
    def __init__(self, args1, args2, args3, args4):
        super(Net, self).__init__()
        self.customout = CustomLayer(args1, args2)
        self.hidden = torch.nn.Linear(args2, args3)
        self.predict = torch.nn.Linear(args3, args4)

    def forward(self, x):
        out = ...
        return out

I would like to split model parameters as:

param1 = {customLayer.weight, customLayer.bias}
param2 = {*.Linear.weight, *.Linear.bias}

So that I can maximize over param1 and minimize over param2.
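Roughly, the training step I have in mind looks like this (just a sketch; param1/param2 are the groups described above, criterion, x, y are placeholders, and maximize=True on Adam is available in recent PyTorch versions):

optimizer1 = torch.optim.Adam(param1, lr=learning_rate, maximize=True)  # ascend on the custom layer parameters
optimizer2 = torch.optim.Adam(param2, lr=learning_rate)                 # descend on the Linear parameters

optimizer1.zero_grad()
optimizer2.zero_grad()
loss = criterion(net(x), y)
loss.backward()
optimizer1.step()
optimizer2.step()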

I tried something like this:

param1 = set()
param2 = set()
for name, m in net.named_parameters():
    if name.lower().startswith('custom'):
        param1 |= set(m)
    else:
        param2 |= set(m)

optimizer1 = torch.optim.Adam(param1, lr=learning_rate)
optimizer2 = torch.optim.Adam(param2, lr=learning_rate)

But this results in an error:

    raise ValueError("can't optimize a non-leaf Tensor")
ValueError: can't optimize a non-leaf Tensor

Please let me know how this can be addressed.

Thank you

Are you seeing the same error if you pass all the parameters to the optimizer?
If so, I guess you might be applying a differentiable operation, such as to(), to (some of) the parameters inside the CustomLayer.
Could you post the CustomLayer definition in this case, so that we could have a look?
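Just to illustrate what I mean, this kind of pattern (a made-up parameter, not your code) reproduces exactly that error:

import torch
import torch.nn as nn

w = nn.Parameter(torch.randn(3, 4)).to(torch.float64)  # to() produces a new tensor with a grad_fn
print(w.is_leaf)                                        # False -> no longer a leaf
torch.optim.Adam([w], lr=1e-3)                          # ValueError: can't optimize a non-leaf Tensor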

When using net.parameters() in the optimizer, it works fine.

The CustomLayer definition is just an element-wise product, as shown below:

class CustomLayer(nn.Module):
    def __init__(self, args1, args2):
        super(CustomLayer, self).__init__()
        self.weights = nn.Parameter(torch.Tensor(args1, args2))

    def forward(self, x):
        return x * self.weights
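For reference, I use it roughly like this (the shapes here are just an example, and the weights need an explicit init since torch.Tensor(args1, args2) is uninitialized):

layer = CustomLayer(4, 8)
nn.init.normal_(layer.weights)      # torch.Tensor(args1, args2) starts out uninitialized
x = torch.randn(16, 4, 8)           # any shape that broadcasts with (4, 8)
out = layer(x)                      # element-wise product, same shape as x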

Thanks for the update. Could you check which objects are stored in param1 and param2? Are you seeing valid nn.Parameters in them?
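Something like this (using the param1/param2 sets from your snippet) should show it:

for p in param1 | param2:
    print(type(p), p.shape, p.is_leaf)

If that prints plain torch.Tensor objects with is_leaf == False instead of nn.Parameter objects, it would suggest that set(m) is iterating over the rows of each parameter tensor (which yields non-leaf views), rather than collecting the parameters themselves, e.g. via param1.add(m) or by appending m to a plain list.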