How is "grad_output" in a PyTorch backward function computed?

I am a beginner in PyTorch. I have a question about how "grad_output" in a custom backward function is computed from an MSE loss. In my case, I define a new layer as the network's output layer, with my own forward and backward functions but no network parameters; I just want to use "grad_output" to do some work. However, I don't know how "grad_output" is computed from the MSE loss. Can you give me some advice on this question?
Thank you!

class MyFun(torch.autograd.Function):
    @staticmethod
    def forward(ctx, inp):
        return inp

    @staticmethod
    def backward(ctx, grad_output):
        # use grad_output here
        grad = grad_output
        return grad

loss = torch.sum((y_pred - y).pow(2))
loss.backward()
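For reference, here is a minimal, runnable sketch of the setup I am asking about (the tensor values are just for illustration). Since the identity layer is the last one before the loss, I expect grad_output passed into backward to equal the derivative of the sum-of-squares loss with respect to the layer's output, i.e. 2 * (y_pred - y):

```python
import torch

class MyFun(torch.autograd.Function):
    @staticmethod
    def forward(ctx, inp):
        # identity layer: pass the input through unchanged
        return inp

    @staticmethod
    def backward(ctx, grad_output):
        # grad_output is d(loss)/d(output) of this layer;
        # for the identity, the input gradient is the same
        return grad_output

y = torch.tensor([1.0, 2.0, 3.0])
y_pred = torch.tensor([1.5, 2.0, 2.0], requires_grad=True)

out = MyFun.apply(y_pred)
loss = torch.sum((out - y).pow(2))
loss.backward()

# For loss = sum((y_pred - y)^2), grad_output = 2 * (y_pred - y)
print(y_pred.grad)  # tensor([ 1.,  0., -2.])
```

Since the layer does nothing to the values, y_pred.grad here is exactly the grad_output that backward received.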