Is backward a standard function that runs forward propagation inside it? Or when is this forward function actually called? Other posts related to this are welcome.
So, you call forward explicitly, and the autograd engine computes the backward operation when you call backward in the line:
g_loss.backward()
Also, a neural network works by first doing a forward pass and then backpropagating the errors; in most libraries, the backward operation is handled by an automatic differentiation engine that computes the gradients for you.
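To make the idea concrete, here is a minimal toy sketch of how such an engine can work (a hypothetical `Value` class, not PyTorch's actual implementation): the forward pass records each operation in a graph, and calling `backward()` walks that graph in reverse, applying each operation's gradient rule.

```python
# Toy autograd sketch (hypothetical, for illustration only):
# forward ops build a graph; backward() replays it in reverse.

class Value:
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents   # nodes recorded during the forward pass
        self._grad_fn = None      # how to push gradients to the parents

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def grad_fn(upstream):    # d(a*b)/da = b, d(a*b)/db = a
            self.grad += upstream * other.data
            other.grad += upstream * self.data
        out._grad_fn = grad_fn
        return out

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def grad_fn(upstream):    # d(a+b)/da = d(a+b)/db = 1
            self.grad += upstream
            other.grad += upstream
        out._grad_fn = grad_fn
        return out

    def backward(self):
        # Topologically order the recorded graph, then apply each
        # node's grad_fn in reverse (this is "backpropagation").
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node._parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0           # d(loss)/d(loss) = 1
        for node in reversed(order):
            if node._grad_fn:
                node._grad_fn(node.grad)

# Forward pass: loss = x * w + b (graph is recorded here)
x, w, b = Value(2.0), Value(3.0), Value(1.0)
loss = x * w + b
loss.backward()                   # triggers the recorded backward ops
print(w.grad)                     # d(loss)/dw = x = 2.0
print(x.grad)                     # d(loss)/dx = w = 3.0
```

This mirrors the pattern in the answer above: forward is what your code calls explicitly, while backward is generated automatically from the operations the engine recorded.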