What is the best way to check if parameters are already in an optimizer?

Hi, I am freezing and un-freezing parameters of a model during training, so I want to check whether some parameters are already in my optimizer (so that I won’t add them again; when I do, I get a ValueError: some parameters appear in more than one parameter group).

I have tried the following:

... 
        for prm in self.model.parameters():
            prm.requires_grad = True
            # try to skip parameters the optimizer is already updating
            if id(prm) not in list(self.optimizer.state_dict()['state'].keys()):
                new_trainable_parameters.append(prm)

        if new_trainable_parameters != []:
            my_dict = {'params': new_trainable_parameters}
            self.optimizer.add_param_group(my_dict)

but it is not working (I still get the same error).

Any ideas? Thanks in advance.

Hey Katerina,

Just wondering, have you found an answer to your question?

Hi,

The approach you have taken is more complicated than it needs to be.
The idea is that when you freeze a parameter with requires_grad=False, you can still pass all of the model's parameters to the optimizer: no gradients are accumulated for a frozen parameter, so its .grad stays None and the optimizer simply skips it during the update, as if you had never provided it.
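
For example, a minimal sketch of that pattern (the model, sizes, and learning rate here are made-up placeholders, not taken from your code) could look like this:

    import torch
    import torch.nn as nn

    # toy model and optimizer; names and sizes are illustrative only
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)  # all parameters passed once

    # freeze the first layer: its .grad stays None, so the optimizer skips it
    for prm in model[0].parameters():
        prm.requires_grad = False

    x, y = torch.randn(16, 4), torch.randn(16, 2)
    loss = nn.functional.mse_loss(model(x), y)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()  # only the un-frozen parameters are updated

    # later, un-freeze without touching the optimizer or its param groups
    for prm in model[0].parameters():
        prm.requires_grad = True
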

Bests
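
P.S. If you still want to add new parameter groups yourself and need the membership check from the original question, comparing against the parameter objects already held in optimizer.param_groups is more reliable than optimizer.state_dict()['state'], whose keys are integer indices rather than id(prm) (and which is empty for parameters that have not been updated yet). A rough sketch, reusing the self.model / self.optimizer names from the question:

    # parameters the optimizer already knows about, gathered from its param groups
    existing = {p for group in self.optimizer.param_groups for p in group['params']}

    # collect only the parameters that are not registered yet
    new_trainable_parameters = [p for p in self.model.parameters() if p not in existing]
    if new_trainable_parameters:
        self.optimizer.add_param_group({'params': new_trainable_parameters})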
