What does Param.requires_grad_ do?

Hi,
I would appreciate your help in clarifying my question.
What is the difference between param.requires_grad_ = False
and param.requires_grad = False (without the trailing underscore)?
Each one gives me different results: param.requires_grad = False reduces the training time, while the other one doesn't.

for param in model.parameters():
    param.requires_grad_ = True

See here: https://github.com/pytorch/pytorch/blob/d0ad848aa55aae7afdbde22d7cbbf55045f4b26b/torch/nn/modules/module.py#L2420

From the looks of it, that just deals with modules that contain multiple separate parameters. It only gets called under certain circumstances and just sets .requires_grad for all of the individual parameters.
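
This is also the key to the original question: on a tensor, requires_grad is an attribute you can assign to, while requires_grad_ (with the trailing underscore) is an in-place method. Writing param.requires_grad_ = False doesn't call the method; it just shadows it with a plain Python attribute, so autograd never sees the change. A minimal sketch to show this (the variable name p is just for illustration):

import torch

p = torch.nn.Parameter(torch.randn(3))
print(p.requires_grad)    # True: Parameters track gradients by default

p.requires_grad = False   # assigning the attribute does change autograd
print(p.requires_grad)    # False

p.requires_grad_(True)    # calling the in-place method also works
print(p.requires_grad)    # True

p.requires_grad_ = False  # this only shadows the method with a bool attribute
print(p.requires_grad)    # still True, so nothing was frozen

That is why param.requires_grad = False reduces training time (the parameters are actually frozen) while param.requires_grad_ = False has no effect.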

Here is some sample code to demonstrate how you might use .requires_grad_:

import torch.nn as nn

class TestModel(nn.Module):
    def __init__(self):
        super().__init__()

        self.layer1 = nn.Linear(5, 10)
        self.layer2 = nn.Linear(10, 20)
        self.layer3 = nn.Linear(20, 1)

    def forward(self, x):
        x = self.layer1(x)
        x = self.layer2(x)
        return self.layer3(x)


model = TestModel()

# One call on the top-level module recursively sets requires_grad
# for every parameter in every submodule.
model.requires_grad_(False)

for param in model.parameters():
    print(param.requires_grad)  # False for all six parameters (weights and biases)

Note that the single call set requires_grad = False on the parameters of all of the layers.
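
As a follow-up, a common fine-tuning pattern (a sketch reusing the model defined above) is to freeze everything and then re-enable gradients on just the last layer:

# Freeze all parameters, then unfreeze only the final layer.
model.requires_grad_(False)
model.layer3.requires_grad_(True)

for name, param in model.named_parameters():
    print(name, param.requires_grad)  # only layer3.* prints True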

Thank you for your reply.