Error in setting custom lr for different parameters

Hello
I am new to PyTorch and I want to set different learning rates for some layers' weights and biases.
Here is my code:

        with open('DCN.txt', 'r') as Defor_layer:
            my_list = Defor_layer.readlines()

        params = list(filter(lambda kv: kv[0] in my_list, model.named_parameters()))
        base_params = list(filter(lambda kv: kv[0] not in my_list, model.named_parameters()))
      
        if args.optm == "SGD":
            optimizer = SGD([
                  {'params': base_params},
                  {'params': params, 'lr':1}]
                  , lr=0.05, momentum=0.9, weight_decay=0.0002)
        else:
            optimizer = Adam([{'params': base_params},
                              {'params': params, 'lr':1}]
                            , lr=0.05, betas=(0.9, 0.999), eps=1e-08, weight_decay=0.0002)

and my params list looks like the one below, only longer:

module.encoder.layers.7.DCNv2.conv_offset_mask.weight
module.encoder.layers.7.DCNv2.conv_offset_mask.bias
module.encoder.layers.8.DCNv2.conv_offset_mask.weight
module.encoder.layers.8.DCNv2.conv_offset_mask.bias

However, I am getting this error and I am not sure what it is about:
File "main_FE_Deco_v2_lr.py", line 191, in train
    , lr=0.05, momentum=0.9, weight_decay=0.0002)
File "/home/kh/.conda/envs/torch41/lib/python3.6/site-packages/torch/optim/sgd.py", line 64, in __init__
    super(SGD, self).__init__(params, defaults)
File "/home/kh/.conda/envs/torch41/lib/python3.6/site-packages/torch/optim/optimizer.py", line 43, in __init__
    self.add_param_group(param_group)
File "/home/kh/.conda/envs/torch41/lib/python3.6/site-packages/torch/optim/optimizer.py", line 191, in add_param_group
    "but one of the params is " + torch.typename(param))
TypeError: optimizer can only optimize Tensors, but one of the params is tuple

I appreciate any guidance.

I realized what the problem is.
The params list contains both the names and the data, which is why I got the error saying one of the params is a tuple. All I had to do was pass only the data as params, not the names along with them.
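To see why the error appears, note that `named_parameters()` yields `(name, tensor)` tuples, so filtering it directly (as in the code above) produces lists of tuples rather than tensors. A minimal sketch with a toy one-layer model:

```python
import torch
import torch.nn as nn

# Toy model standing in for the real network
model = nn.Linear(4, 2)

# named_parameters() yields (name, tensor) tuples, not bare tensors
first = next(iter(model.named_parameters()))
print(type(first))      # <class 'tuple'>
print(type(first[0]))   # <class 'str'>  -- the parameter name
print(type(first[1]))   # the torch.nn.Parameter itself

# Passing such tuples to an optimizer is what triggers:
# TypeError: optimizer can only optimize Tensors, but one of the params is tuple
```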

Could you please show how to send only the data as params? I have the same error as yours.

Hi @luciaL, before initializing the optimizer, adding the following two lines to extract the parameter tensors from the tuples in the two lists resolves the issue:
params = [i[1] for i in params]
base_params = [i[1] for i in base_params]
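Putting it all together, here is a self-contained sketch using a toy two-layer model and a hard-coded name list in place of reading `DCN.txt` (both are stand-ins for the original setup): filter the parameters by name, keep only the tensors, then build the optimizer with two groups.

```python
import torch.nn as nn
from torch.optim import SGD

# Toy model standing in for the original network (hypothetical)
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Names that should get the custom learning rate
# (stand-in for the contents of DCN.txt)
my_list = ['1.weight', '1.bias']

# named_parameters() yields (name, tensor) tuples; filter by name,
# then keep only the tensor (the second element of each tuple)
named = list(model.named_parameters())
params = [p for n, p in named if n in my_list]            # custom-lr group
base_params = [p for n, p in named if n not in my_list]   # default-lr group

optimizer = SGD([
      {'params': base_params},
      {'params': params, 'lr': 1}],
      lr=0.05, momentum=0.9, weight_decay=0.0002)

print([g['lr'] for g in optimizer.param_groups])  # [0.05, 1]
```

One more thing worth checking when reading the names from a file: `readlines()` keeps the trailing newline on each line, so the names may need a `.strip()` before comparing them against `named_parameters()` names.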