What does backward() do for smooth_l1_loss?

Hi All,

I was working through the regression example on the PyTorch examples page, and I wanted to understand how the backward() function handles grad_fn=.

Regards

Basically, I want to know how to calculate the backward pass of smooth_l1_loss by hand.
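To make the question concrete, here is a minimal sketch of what I am trying to verify (assuming the defaults reduction='mean' and beta=1.0, where the loss is 0.5*x^2 for |x| < 1 and |x| - 0.5 otherwise, with x = input - target). The piecewise derivative should then be x inside the quadratic region and sign(x) outside it, divided by the number of elements for the mean reduction:

```python
import torch
import torch.nn.functional as F

# Toy data (names are just for illustration)
pred = torch.randn(5, requires_grad=True)
target = torch.randn(5)

# Autograd's gradient via backward()
loss = F.smooth_l1_loss(pred, target)  # reduction='mean', beta=1.0
loss.backward()

# Manual piecewise derivative of the same loss:
# d/dx of 0.5*x^2 is x for |x| < 1, and d/dx of |x| - 0.5 is sign(x) otherwise;
# the mean reduction divides by the number of elements.
diff = (pred - target).detach()
manual = torch.where(diff.abs() < 1.0, diff, diff.sign()) / diff.numel()

print(torch.allclose(pred.grad, manual))  # the two gradients should match
```

Is this the right way to think about what grad_fn computes for this loss?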