Implicit sum in the autograd.Function.backward method

I have created a new autograd Function with both a forward and a backward method.
One of the inputs to forward is a scalar, and the corresponding gradient in backward is a multi-dimensional tensor. As far as I know these two tensors are supposed to have the same size, but it works anyway: autograd just computes the sum over the gradient tensor and uses that as the scalar's gradient.
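Here is a minimal sketch of what I mean (the class name `ScaleByScalar` and the `a * x` operation are just placeholders for illustration, not my real code; behaviour observed on a recent PyTorch version):

```python
import torch

class ScaleByScalar(torch.autograd.Function):
    """Toy Function: forward(a, x) = a * x, where a is a 0-dim scalar tensor."""

    @staticmethod
    def forward(ctx, a, x):
        ctx.save_for_backward(a, x)
        return a * x

    @staticmethod
    def backward(ctx, grad_output):
        a, x = ctx.saved_tensors
        # Gradient w.r.t. the scalar `a` is returned with the shape of `x`,
        # instead of being reduced to a scalar with .sum() as I expected
        # to be required.
        grad_a = grad_output * x
        grad_x = grad_output * a
        return grad_a, grad_x


a = torch.tensor(2.0, requires_grad=True)   # scalar (0-dim) input
x = torch.arange(3.0, requires_grad=True)   # multi-dimensional input
ScaleByScalar.apply(a, x).sum().backward()

# No shape error is raised; a.grad ends up being the sum of the
# multi-dimensional gradient returned above, i.e. tensor(3.) here.
print(a.grad)
```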
I could not find where the documentation says that a sum is taken in this case, and I do not understand this sentence from the documentation of the backward method:
“Computes the sum of gradients of given tensors with respect to graph leaves”.
The sum of what, exactly?