I added the following lines to the ImageNet example, using a pretrained ResNet-18 model.
for param in model.parameters():
    param.requires_grad = False

# Replace the last fully-connected layer
# Parameters of newly constructed modules have requires_grad=True by default
model.fc = torch.nn.Linear(512, 3)

optimizer = torch.optim.SGD(model.fc.parameters(), args.lr,
                            momentum=args.momentum,
                            weight_decay=args.weight_decay)
But then I get the following error:
File "main.py", line 234, in train
loss.backward()
File "/usr/local/lib/python2.7/dist-packages/torch/autograd/variable.py", line 146, in backward
self._execution_engine.run_backward((self,), (gradient,), retain_variables)
RuntimeError: there are no graph nodes that require computing gradients
I would like to freeze all parameters of the original ResNet-18 and only train the last layer with 3 classes. How should I do this correctly? Based on information from the forum, this should be the working version.
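For reference, here is a minimal, self-contained sketch of the freeze-then-replace pattern I'm trying to use. It substitutes a tiny stand-in module for the real ResNet-18 (so it runs without downloading weights) and uses the plain tensor API rather than my full training script, but the freezing logic is the same:

```python
import torch

# Stand-in for the pretrained backbone (assumption: a tiny Sequential
# instead of the real ResNet-18, just to demonstrate the freezing logic;
# 512 matches the feature size feeding ResNet-18's fc layer).
model = torch.nn.Sequential(
    torch.nn.Linear(8, 512),
    torch.nn.ReLU(),
)

# Freeze every existing parameter.
for param in model.parameters():
    param.requires_grad = False

# New final layer: parameters of newly constructed modules
# have requires_grad=True by default.
head = torch.nn.Linear(512, 3)

# Optimize only the new layer's parameters.
optimizer = torch.optim.SGD(head.parameters(), lr=0.01, momentum=0.9)

x = torch.randn(4, 8)
target = torch.randint(0, 3, (4,))
loss = torch.nn.functional.cross_entropy(head(model(x)), target)

optimizer.zero_grad()
loss.backward()   # succeeds: the new head still requires gradients
optimizer.step()

print(head.weight.grad is not None)   # gradient reaches the new layer
print(model[0].weight.grad is None)   # frozen backbone gets no gradient
```

In this standalone form `backward()` runs fine, which is why I suspect something about how the lines interact with the rest of the ImageNet example script rather than the freezing pattern itself.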