Gradient normalization

Hello,

I have a question about the gradient that is computed when we call loss.backward().
Does anyone know whether that gradient is normalized?

Thank you.

No, PyTorch won't normalize the gradient behind your back. loss.backward() stores the raw gradient of the loss with respect to each parameter in param.grad (accumulating into any value already there, which is why you call optimizer.zero_grad()). Note that most built-in loss functions default to reduction='mean', so the loss itself is averaged over the batch, but that is part of the loss definition, not a normalization of the gradient. If you want a normalized or bounded gradient, you have to do it yourself before the optimizer step.
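
A quick way to convince yourself (a minimal sketch; the model and data here are made up purely for illustration):

```python
import torch

# Toy model and random data, just to produce a gradient.
model = torch.nn.Linear(10, 1)
x = torch.randn(32, 10)
y = torch.randn(32, 1)

loss = torch.nn.functional.mse_loss(model(x), y)
loss.backward()

# The overall gradient norm is whatever the loss produces -- not 1.0.
total_norm = torch.norm(
    torch.stack([p.grad.norm() for p in model.parameters()])
)
print(total_norm)  # arbitrary scale; depends on the data and the loss

# If you want to bound the gradient yourself, clip it before
# optimizer.step(); clip_grad_norm_ also returns the pre-clip norm.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
```

If you run this a few times you'll see the printed norm vary with the random data, which is exactly what you'd expect from an unnormalized gradient.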

Thank you very much!