Hello,
I am new to PyTorch and I want to set a different learning rate for some layers' weights and biases.
Here is my code:
with open('DCN.txt', 'r') as Defor_layer:
    my_list = Defor_layer.readlines()

params = list(filter(lambda kv: kv[0] in my_list, model.named_parameters()))
base_params = list(filter(lambda kv: kv[0] not in my_list, model.named_parameters()))

if args.optm == "SGD":
    optimizer = SGD(
        [{'params': base_params},
         {'params': params, 'lr': 1}],
        lr=0.05, momentum=0.9, weight_decay=0.0002)
else:
    optimizer = Adam(
        [{'params': base_params},
         {'params': params, 'lr': 1}],
        lr=0.05, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.0002)
The parameter-name list in DCN.txt looks like this, but is longer:
module.encoder.layers.7.DCNv2.conv_offset_mask.weight
module.encoder.layers.7.DCNv2.conv_offset_mask.bias
module.encoder.layers.8.DCNv2.conv_offset_mask.weight
module.encoder.layers.8.DCNv2.conv_offset_mask.bias
However, I am getting this error and I am not sure what it is about:
File "main_FE_Deco_v2_lr.py", line 191, in train
    , lr=0.05, momentum=0.9, weight_decay=0.0002)
File "/home/kh/.conda/envs/torch41/lib/python3.6/site-packages/torch/optim/sgd.py", line 64, in __init__
    super(SGD, self).__init__(params, defaults)
File "/home/kh/.conda/envs/torch41/lib/python3.6/site-packages/torch/optim/optimizer.py", line 43, in __init__
    self.add_param_group(param_group)
File "/home/kh/.conda/envs/torch41/lib/python3.6/site-packages/torch/optim/optimizer.py", line 191, in add_param_group
    "but one of the params is " + torch.typename(param))
TypeError: optimizer can only optimize Tensors, but one of the params is tuple
I appreciate any guidance.
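Edit: while waiting for replies, I put together a minimal, self-contained sketch of what I suspect is going on (the toy model and parameter names below are made up, not my real network). `model.named_parameters()` yields `(name, tensor)` tuples, so my `filter` calls keep whole tuples, while the optimizer apparently wants bare tensors; keeping only the tensor part seems to avoid the error:

```python
import torch.nn as nn
from torch.optim import SGD

# Tiny stand-in for my real model (hypothetical layer names).
model = nn.Sequential(nn.Linear(4, 4), nn.Linear(4, 2))

# Stand-in for the names read from DCN.txt; note readlines() would also
# keep trailing '\n' characters, which I strip here.
special_names = {"0.weight", "0.bias"}

named = list(model.named_parameters())  # list of (name, tensor) tuples

# Keep only the tensors, not the (name, tensor) tuples.
params = [p for n, p in named if n in special_names]
base_params = [p for n, p in named if n not in special_names]

optimizer = SGD(
    [{'params': base_params},
     {'params': params, 'lr': 1}],
    lr=0.05, momentum=0.9, weight_decay=0.0002)
```

With the tuples instead of the tensors, this construction raises the same TypeError I posted above, so I believe that is the cause, but I would appreciate confirmation.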