Hi,
As far as I understand, the `requires_grad` attribute of a parameter should be True if the parameter needs to be updated. But in my code, I find that `Conv2d.weight.data.requires_grad` is False.
I just added the conv layer in the normal way: `self.conv1 = nn.Conv2d(…)`
Is there something wrong with my understanding or with my code?
`requires_grad` is always False for the `.data` attribute. `.data` is a legacy attribute for performing operations without having them tracked by autograd. You should not use `.data`. The recommended ways to achieve the same are `with torch.no_grad():` or `.detach()`.
Have a look at the following example:
In[1]: x = torch.nn.Conv2d(2, 2, 1)
In[2]: x.weight.requires_grad
Out[2]: True
In[3]: x.weight.data.requires_grad
Out[3]: False
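To illustrate the recommended alternatives, here is a minimal sketch showing `.detach()` and `torch.no_grad()` in action (the variable names are just for illustration):

```python
import torch

conv = torch.nn.Conv2d(2, 2, 1)

# .detach() returns a tensor sharing the same storage,
# but cut off from the autograd graph
w = conv.weight.detach()
print(w.requires_grad)  # False

# Inside torch.no_grad(), operations are not tracked by autograd,
# so in-place updates on parameters are allowed
with torch.no_grad():
    conv.weight += 1.0

# The parameter itself still requires gradients afterwards
print(conv.weight.requires_grad)  # True
```

Unlike `.data`, both approaches are safe: `.detach()` tensors still participate in autograd's in-place correctness checks, and `no_grad()` makes the intent explicit.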