Does the backward function call the forward function?

So I noticed from the following PyTorch code that the loss script has no backward function.

class GeneratorLoss(nn.Module):
    def __init__(self):
        super(GeneratorLoss, self).__init__()

    def forward(self, out_labels, out_images, target_images):
        ...  # loss computation omitted in the original snippet

And when the code is executed, only the backward function is called:

generator_criterion = GeneratorLoss()
g_loss = generator_criterion(fake_out, fake_img, real_img)

Is backward a standard function that does forward propagation inside it? Or when will this forward function be called? Links to other posts related to this are welcome.


For defining custom losses, you just need to define the forward function; the backward pass is computed entirely by the autograd engine.

g_loss = generator_criterion(fake_out, fake_img, real_img)

This line is effectively equivalent to:

g_loss = generator_criterion.forward(fake_out, fake_img, real_img)

So forward is what you are actually calling (indirectly, through the module's __call__, which also runs any registered hooks), and the autograd engine computes the backward operation when you call backward on the resulting loss:

g_loss.backward()


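To make this concrete, here is a minimal self-contained sketch (the loss module and tensor names are hypothetical, not from the original code): only forward is defined, yet gradients are still produced because autograd records the operations in forward and differentiates them automatically.

```python
import torch
import torch.nn as nn

# Hypothetical minimal custom loss: only forward is defined.
class L1ImageLoss(nn.Module):
    def forward(self, out_images, target_images):
        # Autograd records this computation and derives backward from it.
        return torch.mean(torch.abs(out_images - target_images))

criterion = L1ImageLoss()
fake_img = torch.randn(1, 3, 8, 8, requires_grad=True)
real_img = torch.randn(1, 3, 8, 8)

loss = criterion(fake_img, real_img)  # __call__ invokes forward
loss.backward()                       # autograd computes the gradients

# fake_img.grad is now populated, with no handwritten backward function.
print(fake_img.grad.shape)
```

Calling the module (criterion(...)) rather than criterion.forward(...) is the recommended form, since __call__ also takes care of hooks.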
Also, a neural network works by first feedforwarding, then backpropagating the errors; in most libraries the backward operation is handled by an automatic differentiation engine that computes the gradients for you.
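That feedforward-then-backpropagate cycle looks like this in a typical PyTorch training step (the model, data, and hyperparameters here are illustrative placeholders):

```python
import torch
import torch.nn as nn

# Illustrative model and data, not from the original post.
model = nn.Linear(4, 1)
criterion = nn.MSELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(8, 4)
y = torch.randn(8, 1)

optimizer.zero_grad()       # clear gradients from the previous step
pred = model(x)             # feedforward
loss = criterion(pred, y)   # forward of the loss module
loss.backward()             # autograd backpropagates the error
optimizer.step()            # update parameters from the computed gradients
```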



Thanks! That was exactly the explanation I needed.