Autograd: why the gradient accumulation mechanism?

Why does PyTorch's autograd accumulate gradients instead of overwriting them?
What is this for?

Do you have any idea?
Thank you in advance.

Hi,

I’m not sure I understand your question.
If you’re asking why .backward() accumulates gradients, then this post should give you a good idea: Why do we need to set the gradients manually to zero in pytorch?
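
For illustration, here is a minimal sketch (not from that post, tensor values chosen arbitrarily) showing that repeated .backward() calls sum into .grad, and that you clear it explicitly between iterations:

```python
import torch

# .backward() adds into .grad rather than overwriting it.
x = torch.ones(3, requires_grad=True)

y = (x * 2).sum()
y.backward()
print(x.grad)   # tensor([2., 2., 2.])

z = (x * 3).sum()
z.backward()
print(x.grad)   # tensor([5., 5., 5.])  -> the two gradients were accumulated (2 + 3)

# Clearing the gradient between iterations gives a fresh gradient again;
# this is what optimizer.zero_grad() does in a typical training loop.
x.grad.zero_()
w = (x * 3).sum()
w.backward()
print(x.grad)   # tensor([3., 3., 3.])
```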