Transfer learning weight updating

I was trying to use ResNet for transfer learning. Basically, I wanted to keep only the last fc layer trainable and leave the bottleneck layers frozen, so I did:

        vision = torchvision.models.resnet50(pretrained=True)
        for param in vision.parameters():
            param.require_grid = False
        vision.fc = nn.Linear(x, y)

However, it appears that the weights of all the layers in the model are still changing during training. Did I do something wrong?

You just misspelled `requires_grad` as `require_grid`. Python silently creates a new attribute instead of raising an error, so every parameter kept its default `requires_grad=True` and the whole network stayed trainable.