Warm-up: numpy explained

Hi, I am new to PyTorch and I am going through the tutorials. I find them useful, but in my opinion they would be slightly better if they explained the code line by line. Maybe that is just me, though.

After finishing the 60 Minute Blitz, I am now working through the "Learning PyTorch with Examples" tutorials. I am starting from this one, and I would like to better understand the backpropagation code quoted below. Why do we need those lines?
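
For reference, and so the variable names below make sense, here is the setup and forward pass from that example (copied from the tutorial as best I can):

    import numpy as np

    # N is batch size; D_in, H, D_out are input, hidden, and output dimensions
    N, D_in, H, D_out = 64, 1000, 100, 10

    # Random input data and randomly initialized weights
    x = np.random.randn(N, D_in)
    y = np.random.randn(N, D_out)
    w1 = np.random.randn(D_in, H)
    w2 = np.random.randn(H, D_out)

    # Forward pass: compute predicted y and the squared-error loss
    h = x.dot(w1)
    h_relu = np.maximum(h, 0)
    y_pred = h_relu.dot(w2)
    loss = np.square(y_pred - y).sum()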

Specifically, I don’t understand why we need all of the following steps to compute the backpropagation:

    # Backprop to compute gradients of w1 and w2 with respect to loss
    grad_y_pred = 2.0 * (y_pred - y)     # gradient of loss w.r.t. y_pred
    grad_w2 = h_relu.T.dot(grad_y_pred)  # gradient of loss w.r.t. w2
    grad_h_relu = grad_y_pred.dot(w2.T)  # gradient flowing back into h_relu
    grad_h = grad_h_relu.copy()
    grad_h[h < 0] = 0                    # ReLU zeroes the gradient where h < 0
    grad_w1 = x.T.dot(grad_h)            # gradient of loss w.r.t. w1
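
To make the question concrete: here is a small sanity check I sketched myself (it is not part of the tutorial). It compares grad_w1[0, 0] against a finite-difference estimate, and the two should roughly agree, so my question is about where each of those lines comes from, not whether the code is correct.

    # My own sketch, not from the tutorial: numerically check grad_w1[0, 0].
    # Perturb one weight, rerun the forward pass, and compare the
    # finite-difference slope of the loss to the analytic gradient above.
    eps = 1e-4
    w1_pert = w1.copy()
    w1_pert[0, 0] += eps
    h_pert = x.dot(w1_pert)
    h_relu_pert = np.maximum(h_pert, 0)
    loss_pert = np.square(h_relu_pert.dot(w2) - y).sum()
    print((loss_pert - loss) / eps)  # finite-difference estimate
    print(grad_w1[0, 0])             # analytic gradient; should roughly match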

Thanks in advance!