Is "parameter.grad" the gradient*lr or just the gradient?

Hi, I want to measure the gradient norm. I have working code, I just want to interpret it:

        total_norm = 0.0
        for p in model.parameters():
            if p.grad is not None:
                param_norm = p.grad.detach().norm(2)
                total_norm += param_norm.item() ** 2
        total_norm = total_norm ** 0.5

Now I wonder: is p.grad already multiplied by the learning rate, or is it the pure gradient?

Thanks a lot!

No, it is just the gradient itself :blush:. The learning rate is only applied later, inside the optimizer's step(), when the parameters are actually updated.
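You can verify this yourself with a tiny sketch (the scalar value 3.0 and lr=0.1 are arbitrary choices for illustration): after backward(), p.grad holds the analytic gradient, and only optimizer.step() applies the learning rate.

```python
import torch

# Toy loss: loss = (3*w)^2, so dloss/dw = 18*w. At w = 2.0 the gradient is 36.
w = torch.tensor(2.0, requires_grad=True)
loss = (3.0 * w) ** 2
loss.backward()

# p.grad is the raw gradient -- no learning rate involved yet
print(w.grad.item())  # 36.0

# The lr only enters during the optimizer step: w <- w - lr * grad
opt = torch.optim.SGD([w], lr=0.1)
opt.step()
print(w.item())  # 2.0 - 0.1 * 36.0 = -1.6
```

So whatever lr you choose, w.grad is unchanged; only the updated parameter value depends on it.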
