Parameters appear in more than one parameter group. Not sure why?

I am trying to set different learning rates for different sections of my model, so I have the following code:

print("Net setup")
net = scasnet(3,6)
net_opt = torch.optim.Adam([
    {'params' : net.down},
    {'params' : net.cntx},
    {'params' : net.resc},
    {'params' : net.fine, 'lr':lr*20},
    ], lr=lr)
net_cri = torch.nn.CrossEntropyLoss()
net.cuda()

However, I am getting the “some parameters appear in more than one parameter group” error.

My network is built as follows:

class scasnet(nn.Module):
    def __init__(self, n_channels, n_classes):
        super(scasnet, self).__init__()

        self.mc1 = multi_conv(n_channels, 64)
        self.pixel_conv1 = multi_conv(64, 64, convs=1, k=1, p=0)
        self.mxbasic = nn.MaxPool2d(3, stride=2, padding=1)
        self.mc2 = multi_conv(64, 128)
        self.pixel_conv2 = multi_conv(128, 128, convs=1, k=1, p=0)
        self.mc3 = multi_conv(128, 256, convs=3)
        self.pixel_conv3 = multi_conv(256, 256, convs=1, k=1, p=0)
        self.mc4 = multi_conv(256, 512, convs=3)
        self.mc5 = multi_conv(512, 512, convs=3, p=2, d=2)
        self.pixel_conv45 = multi_conv(512, 512, convs=1, k=1, p=0)
        self.mxp5 = nn.MaxPool2d(3, stride=1, padding=1)
        self.c1 = context(512, 512, 6)
        self.c2 = context(512, 512, 12)
        self.c3 = context(512, 512, 18)
        self.c4 = context(512, 512, 24)
        self.rc = res_correction(512, 512)
        self.rc4 = res_correction(512, 256)
        self.rc3 = res_correction(256, 256)
        self.fine_tune = multi_conv(256, 256, convs=1)
        self.fine_tune_end = multi_conv(256, n_classes, convs=1, k=1)

        
        ## SETTING PARAMS IN GROUPS
        self.down = []
        self.cntx = []
        self.resc = []
        self.fine = []
        for n, m in self.named_modules():
            if '.' in n:
                continue
            if isinstance(m, context):
                for p in m.parameters():
                    self.cntx.append(p)
            elif isinstance(m, res_correction):
                for p in m.parameters():
                    self.resc.append(p)
            elif 'fine_tune' in n:
                for p in m.parameters():
                    self.fine.append(p)
            else:
                for p in m.parameters():
                    self.down.append(p)

If I remove the

{'params': net.down}

entry from the optimizer setup, it works, and the error appears no matter which other group is present alongside this one.
I don't understand how any parameters could end up shared, given the mutually exclusive if/elif/else block at the end of my model's init.
Of course, I may be setting up my parameter groups incorrectly, and if that's the case I would really appreciate the help.


Any answer on this? I am having the same problem.

I also ran into this problem. How did you solve it?

Try checking with named_parameters() to see which parts of the model share the same parameters.
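
Something like the following can narrow it down (a minimal sketch, assuming the group lists net.down, net.cntx, net.resc, and net.fine built in the model above; it compares parameters by object identity and maps any duplicates back to readable names):

from collections import Counter

groups = {'down': net.down, 'cntx': net.cntx,
          'resc': net.resc, 'fine': net.fine}

# Count how many groups each parameter object appears in.
counts = Counter()
membership = {}
for gname, params in groups.items():
    for p in params:
        counts[id(p)] += 1
        membership.setdefault(id(p), []).append(gname)

# Report duplicates with readable names from named_parameters().
for pname, p in net.named_parameters():
    if counts[id(p)] > 1:
        print(pname, 'appears in groups:', membership[id(p)])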

I am struggling to exclude a section of the model so that I can avoid this problem…
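
For anyone landing on this later: the likely culprit in the posted code is that named_modules() yields the root module itself first, under the empty name ''. An empty string contains no '.', so the root survives the if '.' in n: continue filter; since the scasnet instance is neither a context nor a res_correction and its empty name does not contain 'fine_tune', the else branch then appends every parameter of the entire network to self.down. That group overlaps with all the others, which is exactly why the error goes away once {'params': net.down} is removed. Iterating over named_children(), which yields only direct submodules and never the root, avoids the overlap. A minimal sketch of the grouping loop rewritten that way:

        ## SETTING PARAMS IN GROUPS
        self.down, self.cntx, self.resc, self.fine = [], [], [], []
        # named_children() yields only direct submodules, never the
        # root module itself, so no branch can sweep up the whole net.
        for n, m in self.named_children():
            if isinstance(m, context):
                self.cntx += list(m.parameters())
            elif isinstance(m, res_correction):
                self.resc += list(m.parameters())
            elif 'fine_tune' in n:
                self.fine += list(m.parameters())
            else:
                self.down += list(m.parameters())

Equivalently, keeping named_modules() and changing the filter to if n == '' or '.' in n: continue fixes the overlap as well.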