I'm using torch 1.4.0. Here is my error trace:
File "G:\My Drive\Debvrat_Research\Codes\Edge Detection\MultiScale-thickness\train.py", line 99, in <module>
main()
File "G:\My Drive\Debvrat_Research\Codes\Edge Detection\MultiScale-thickness\train.py", line 89, in main
trainer=Trainer(args,net, train_loader=train_loader)
File "G:\My Drive\Debvrat_Research\Codes\Edge Detection\MultiScale-thickness\modules\trainer.py", line 87, in __init__
self.optimizer = torch.optim.SGD(tuned_lrs, lr=args.lr, momentum=args.momentum, weight_decay=args.weight_decay)
File "C:\Anaconda3\lib\site-packages\torch\optim\sgd.py", line 64, in __init__
super(SGD, self).__init__(params, defaults)
File "C:\Anaconda3\lib\site-packages\torch\optim\optimizer.py", line 51, in __init__
self.add_param_group(param_group)
File "C:\Anaconda3\lib\site-packages\torch\optim\optimizer.py", line 216, in add_param_group
raise ValueError("some parameters appear in more than one parameter group")
ValueError: some parameters appear in more than one parameter group
The error is thrown here:
if not param_set.isdisjoint(set(param_group['params'])):
raise ValueError("some parameters appear in more than one parameter group")
My param_set is coming out empty ({}). I neither understand the reason behind it nor how to resolve the error. Any pointers would be useful.
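For reference, here is how I understand the duplicate check works: the optimizer keeps a running set of all parameters already assigned to a group and raises as soon as a new group overlaps it. A torch-free sketch of that logic (my own reconstruction for illustration, not the actual torch.optim source):

```python
def build_param_groups(groups):
    """Mimic the duplicate-parameter check in torch.optim.Optimizer.

    `groups` is a list of dicts like [{'params': [...]}, ...].
    Raises ValueError when the same parameter object appears in more
    than one group. (Reconstruction for illustration only.)
    """
    param_set = set()
    for group in groups:
        # Compare by object identity, since torch Parameters hash by identity.
        group_params = {id(p) for p in group['params']}
        if not param_set.isdisjoint(group_params):
            raise ValueError(
                "some parameters appear in more than one parameter group")
        param_set.update(group_params)
    return groups


w = object()  # stand-in for a shared Parameter
try:
    build_param_groups([{'params': [w]}, {'params': [w]}])
except ValueError as e:
    print(e)  # same error message I'm seeing
```

So my guess is that my tuned_lrs list somehow ends up with the same parameter tensor in two of its group dicts, even though param_set looked empty when I inspected it.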
Thanks!