Visualize param.grad before loss.backward()

When I try to visualize the grad values of my classifier head through pdb.set_trace() in the shell, it returns nothing. Does this mean that param.grad does not exist at this stage?

import pdb

pdb.set_trace()
for i, param in enumerate(self._model_and_loss._model.resnet.fc.parameters()):
    if i == 1:
        pass  # inspecting param.grad here prints None

loss.backward()

The .grad attribute will be lazily created in the first backward call. Afterwards it will be set to zero via optimizer.zero_grad() or model.zero_grad() and will still be valid.
If set_to_none=True is used in .zero_grad(), the attribute will be reset to None, which saves memory and avoids an unneeded accumulation kernel in the next backward call.
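A minimal sketch of this lifecycle, using a toy nn.Linear in place of the ResNet head from the question:

import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 4)
loss = model(x).sum()

# Before the first backward call, .grad is still None.
print(model.weight.grad)        # None

loss.backward()                 # .grad is lazily created here
print(model.weight.grad.shape)  # torch.Size([2, 4])

optimizer.zero_grad(set_to_none=False)
print(model.weight.grad.sum())  # tensor(0.) - zeroed but still allocated

optimizer.zero_grad(set_to_none=True)
print(model.weight.grad)        # None - freed again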

Thank you for your reply.
I should have added that I am in the very first iteration. So, in the first iteration and before loss.backward(), weight.grad should return nothing? Am I correct?

Yes, the .grad attribute will be initialized to None before the first backward call.
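This can be verified directly in the first iteration; the sketch below reuses the attribute path and the loss variable from the question's code:

# Every .grad is None before the very first backward call:
fc = self._model_and_loss._model.resnet.fc
assert all(p.grad is None for p in fc.parameters())

loss.backward()

# Afterwards each parameter holds its accumulated gradient:
assert all(p.grad is not None for p in fc.parameters())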
