Not passing parameters to the optimizer = freezing the layer?

Hi

Is it equivalent to freezing a layer if I don't pass that layer's parameters to the optimizer?

For example:

# Collect the ids of the fc layer's parameters so they can be excluded
ignore_params = list(map(id, model.fc.parameters()))
# Keep only the parameters whose ids are not in the ignore list
train_params = filter(lambda p: id(p) not in ignore_params, model.parameters())

optimizer = torch.optim.SGD([{'params': train_params}], args.lr,
                            momentum=args.momentum,
                            weight_decay=args.weight_decay)

Will the parameters in the fc layer be frozen?
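
For reference, the explicit way of freezing that I'm comparing against is turning off gradient tracking on the fc parameters and handing only the remaining parameters to the optimizer. A minimal sketch, assuming the same model and args as above:

# Explicit freezing: stop autograd from computing gradients for fc
for p in model.fc.parameters():
    p.requires_grad = False

# Pass only the parameters that still require gradients
optimizer = torch.optim.SGD(filter(lambda p: p.requires_grad, model.parameters()),
                            args.lr, momentum=args.momentum,
                            weight_decay=args.weight_decay)

In my original snippet the fc parameters keep requires_grad=True, so as far as I understand their gradients are still computed during backward; they just never receive an update step.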
