What is the meaning of grad_output?

The code snippet is below.

import torch
from torch.autograd.function import Function

class MyCalc(Function):
    @staticmethod
    def forward(ctx, x):
        res = x * x + 2 * x
        ctx.res = res
        
        return res

    @staticmethod
    def backward(ctx, grad_output):
        print(grad_output)

        return grad_output

x = torch.tensor([2.], requires_grad=True)
res = MyCalc.apply(x)
print(res)

res.backward()
print(x.grad)

print(grad_output) shows tensor([1.]), and I want to know what it means.

Suppose we have two sequential layers.
Then, for the first layer, grad_output is the gradient of the second layer's output with respect to its own input (which is the first layer's output) — in other words, the gradient flowing back from the layer above.
In your case, you are calling backward() on a scalar (a tensor with a single element), and by default the gradient is set to 1 for a scalar output.
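As a minimal sketch, here is a reworked version of your MyCalc that applies the chain rule in backward() instead of returning grad_output unchanged. Passing an explicit gradient to backward() (instead of relying on the default 1) makes it visible that grad_output is simply whatever upstream gradient arrives at this function.

import torch
from torch.autograd.function import Function

class MyCalc(Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)  # save the input for use in backward
        return x * x + 2 * x

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        # grad_output is the gradient flowing in from above;
        # multiply it by the local derivative d(x*x + 2x)/dx = 2x + 2
        return grad_output * (2 * x + 2)

x = torch.tensor([2.], requires_grad=True)
res = MyCalc.apply(x)

# Supply an explicit upstream gradient of 3 instead of the default 1
res.backward(torch.tensor([3.]))
print(x.grad)  # tensor([18.]) = 3 * (2*2 + 2)

With the default call res.backward(), grad_output would be tensor([1.]) and x.grad would come out as tensor([6.]), the true derivative at x = 2.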

Thank you.
I get it.