How can I disable gradients for all layers except the last layer in PyTorch?

Your code works.
After running your code snippet, you can print the requires_grad attribute of each parameter:

for name, param in resnet18.named_parameters():
    print(name, param.requires_grad)

which shows that fc.weight and fc.bias both require gradients.
You will also get valid gradients in these layers:

resnet18(torch.randn(1, 3, 224, 224)).mean().backward()
for name, param in resnet18.named_parameters():
    print(name, param.grad)