How can I get the gradients of the weights of a layer?

outputs = net(inputs)
loss = criterion(outputs, targets)
loss.backward()  # populates .grad on every parameter that requires grad
for name, value in net.named_parameters():
    print(name)

Part of the printed output is as follows:

cls_net.conv1.weight
cls_net.bn1.weight
cls_net.bn1.bias
cls_net.layer1.0.conv1.weight
cls_net.layer1.0.bn1.weight
cls_net.layer1.0.bn1.bias
cls_net.layer1.0.conv2.weight
cls_net.layer1.0.bn2.weight
cls_net.layer1.0.bn2.bias
cls_net.layer1.0.conv3.weight
cls_net.layer1.0.bn3.weight
cls_net.layer1.0.bn3.bias
cls_net.layer1.0.downsample.0.weight
cls_net.layer1.0.downsample.1.weight
cls_net.layer1.0.downsample.1.bias
cls_net.layer1.1.conv1.weight
cls_net.layer1.1.bn1.weight
cls_net.layer1.1.bn1.bias
cls_net.layer1.1.conv2.weight
cls_net.layer1.1.bn2.weight
cls_net.layer1.1.bn2.bias

Now I want to print the grad of cls_net.layer1.1.bn2.weight:
print('------->grad ',net.cls_net.layer1.1.bn2.weight.grad)
but I got an error:

Original exception was:
  File "train_autonet.py", line 109
    print('------->grad ',net.cls_net.layer1.1.bn2.weight.grad)
                                         ^
SyntaxError: invalid syntax

It seems to be a Python error.
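(The SyntaxError happens because a Python attribute name cannot start with a digit, so layer1.1 cannot be written with dot syntax. If layer1 is an nn.Sequential container, as in the torchvision ResNets this layout resembles, indexing it should work; getattr is a more general alternative. A minimal sketch, assuming that layout:

# index the nn.Sequential container instead of using dot syntax
print('------->grad ', net.cls_net.layer1[1].bn2.weight.grad)
# getattr works for any registered submodule name, numeric or not
print('------->grad ', getattr(net.cls_net.layer1, '1').bn2.weight.grad)
)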

Or, how can I print the gradients of the weights of all layers?

Do you mean something like this?

for name, param in net.named_parameters():
    print(name, param.grad)
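If you only want the gradient of one specific parameter, you can also look it up by the name shown in your printout; a small sketch, assuming loss.backward() has already been called:

grads = {name: param.grad for name, param in net.named_parameters()}
print(grads['cls_net.layer1.1.bn2.weight'])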

Thank you! It solved my problem, hah.