ValueError: some parameters appear in more than one parameter group

I’m using torch 1.4.0. Here is my error trace:

  File "G:\My Drive\Debvrat_Research\Codes\Edge Detection\MultiScale-thickness\train.py", line 99, in <module>
    main()

  File "G:\My Drive\Debvrat_Research\Codes\Edge Detection\MultiScale-thickness\train.py", line 89, in main
    trainer=Trainer(args,net, train_loader=train_loader)

  File "G:\My Drive\Debvrat_Research\Codes\Edge Detection\MultiScale-thickness\modules\trainer.py", line 87, in __init__
    self.optimizer = torch.optim.SGD(tuned_lrs, lr=args.lr, momentum=args.momentum, weight_decay=args.weight_decay)

  File "C:\Anaconda3\lib\site-packages\torch\optim\sgd.py", line 64, in __init__
    super(SGD, self).__init__(params, defaults)

  File "C:\Anaconda3\lib\site-packages\torch\optim\optimizer.py", line 51, in __init__
    self.add_param_group(param_group)

  File "C:\Anaconda3\lib\site-packages\torch\optim\optimizer.py", line 216, in add_param_group
    raise ValueError("some parameters appear in more than one parameter group")

ValueError: some parameters appear in more than one parameter group

The error is thrown by this check in `optimizer.py`:

```python
if not param_set.isdisjoint(set(param_group['params'])):
    raise ValueError("some parameters appear in more than one parameter group")
```

However, my param_set is coming out empty ({}), so I neither understand why the error is raised nor how to resolve it. Any pointers would be useful.

Thanks!


Maybe you can check your parameter definition functions, for example:
```python
def get_1x_lr_params(model):
    """
    This generator returns all the parameters for the conv and two fc layers of the net.
    """
    b = [model.conv1, model.conv2, model.conv3a, model.conv3b, model.conv4a, model.conv4b,
         model.conv5a, model.conv5b, model.fc6, model.fc7]
    for i in range(len(b)):
        for k in b[i].parameters():
            if k.requires_grad:
                yield k


def get_10x_lr_params(model):
    """
    This generator returns all the parameters for the last fc layer of the net.
    """
    # b = [model.fc7]
    b = [model.fc8]
    for j in range(len(b)):
        for k in b[j].parameters():
            if k.requires_grad:
                yield k
```
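These two generators would then usually be combined into two parameter groups when building the optimizer, roughly like this (a sketch; `args.lr`, `args.momentum` etc. are assumed to come from your training script):

```python
# Each parameter must end up in exactly one group, otherwise SGD raises the error above.
train_params = [
    {'params': get_1x_lr_params(model), 'lr': args.lr},
    {'params': get_10x_lr_params(model), 'lr': args.lr * 10},
]
optimizer = torch.optim.SGD(train_params, lr=args.lr,
                            momentum=args.momentum, weight_decay=args.weight_decay)
```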

Make sure there are no duplicate layers between the two groups.
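If it isn't obvious which layer overlaps, you could also count how often each parameter tensor appears across the groups before constructing the optimizer, e.g. with a quick debugging sketch like this:

```python
# Debugging sketch: flag any parameter tensor that shows up in more than one group.
from collections import Counter

groups = [list(get_1x_lr_params(model)), list(get_10x_lr_params(model))]
counts = Counter(id(p) for group in groups for p in group)
duplicates = [pid for pid, n in counts.items() if n > 1]
print(f"{len(duplicates)} parameters appear in more than one group")
```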