Transfer learning error


import torch
from torchvision import models

model = models.resnet18(pretrained=True)
mean = [0.485, 0.456, 0.406]
std = [0.229, 0.224, 0.225]
for parms in model.parameters():
    parms.requires_grad = False
model.fc = torch.nn.Linear(512, 8)
optimzer = torch.optim.Adam([parms for parmas in model.parameters() if (parms.requires_grad == True)], lr=0.05)

This code is giving me an error like the one below, and I don’t understand where the mistake is.

ValueError                                Traceback (most recent call last)
<ipython-input-18-c701a2bba695> in <module>
      6     parms.requires_grad = False
      7 model.fc = torch.nn.Linear(512,8)
----> 8 optimzer = torch.optim.Adam([parms for parmas in model.parameters() if (parms.requires_grad == True)],lr = 0.05 )

C:\Miniconda\envs\py37_pytorch\lib\site-packages\torch\optim\adam.py in __init__(self, params, lr, betas, eps, weight_decay, amsgrad)
     42         defaults = dict(lr=lr, betas=betas, eps=eps,
     43                         weight_decay=weight_decay, amsgrad=amsgrad)
---> 44         super(Adam, self).__init__(params, defaults)
     45 
     46     def __setstate__(self, state):

C:\Miniconda\envs\py37_pytorch\lib\site-packages\torch\optim\optimizer.py in __init__(self, params, defaults)
     44         param_groups = list(params)
     45         if len(param_groups) == 0:
---> 46             raise ValueError("optimizer got an empty parameter list")
     47         if not isinstance(param_groups[0], dict):
     48             param_groups = [{'params': param_groups}]

ValueError: optimizer got an empty parameter list

Hi,

This might sound funny, but you have a typo in your code.

parms is the loop variable from your earlier loop over the original model's parameters, where you turned requires_grad off. parmas is the variable you actually defined in the comprehension inside optim.Adam, so the parms.requires_grad check still refers to the last frozen parameter from the earlier loop. The condition is therefore always False, the list comes out empty, and the optimizer raises "optimizer got an empty parameter list".
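For reference, one way the corrected optimizer construction could look (just a sketch based on your snippet; adjust names to your code):

for param in model.parameters():
    param.requires_grad = False          # freeze the pretrained backbone
model.fc = torch.nn.Linear(512, 8)       # new head; requires_grad is True by default

optimizer = torch.optim.Adam(
    [p for p in model.parameters() if p.requires_grad],  # only the new fc layer
    lr=0.05,
)
# or equivalently, pass the new layer's parameters directly:
# optimizer = torch.optim.Adam(model.fc.parameters(), lr=0.05)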

Best

:man_facepalming::man_facepalming::man_facepalming:.
Thank you very much!