Hi, I have a problem with autograd. The code is below:
import torch
from torch import nn

input = torch.randn(8, 3, 50, 100)
net = nn.Sequential(nn.Conv2d(3, 16, 3, 1), nn.Conv2d(16, 32, 3, 1))

# freeze 0.weight (the first entry of named_parameters) before the forward pass
net.named_parameters().__next__()[1].requires_grad = False
for name, param in net.named_parameters():
    print(name, param.requires_grad)

output = net(input)

# unfreeze 0.weight after the forward pass, before backward
net.named_parameters().__next__()[1].requires_grad = True
torch.mean(output).backward()
for name, param in net.named_parameters():
    print((name, param.grad))
Part of the output is listed below:
0.weight False
0.bias True
1.weight True
1.bias True
('0.weight', None)
I set requires_grad to False for 0.weight, then set it back to True before calling backward(). But I still get None for the grad of 0.weight. I'm wondering what step in the forward pass causes this.
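For comparison, here is a reduced sketch I tried (a single Linear layer instead of the Sequential above; the names lin_a and lin_b are just for this illustration). It seems to show that what matters is the flag's value at the moment the forward pass builds the graph, not its value at backward time:

```python
import torch
from torch import nn

x = torch.randn(4, 3)

# Case A: flag restored to True *before* the forward pass
lin_a = nn.Linear(3, 2)
lin_a.weight.requires_grad = False
lin_a.weight.requires_grad = True    # restored before forward
lin_a(x).mean().backward()
print(lin_a.weight.grad is None)     # False: grad is populated

# Case B: flag restored to True only *after* the forward pass
lin_b = nn.Linear(3, 2)
lin_b.weight.requires_grad = False
out = lin_b(x)                       # graph built while the flag is False
lin_b.weight.requires_grad = True    # too late for this graph
out.mean().backward()                # backward still runs via the bias
print(lin_b.weight.grad is None)     # True: weight was not recorded in the graph
```

So in my original code, is it the conv op during `output = net(input)` that records `requires_grad = False` for 0.weight, so the later change has no effect on this particular graph?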