Strange gradient when writing the backward function

Hello,
I have tried to write a custom Function with a backward method, like the example shown below. However, the gradient for one of the outputs is zero: grad_output is zero while grad_output1 is fine. If I change output1 = output + 0.1, then both gradients are fine. Is this behavior normal? If a new output variable is the same as another one, does the earlier output get a zero gradient?

from torch.autograd import Function

class cal(Function):

    def forward(self, input, weight):
        self.save_for_backward(input, target)
        output = input * weight
        output1 = output
        return output, output1

    def backward(self, grad_output, grad_output1):
        # ... gradient calculation ...
        return grad_input, grad_weight

Thanks.

Is this a typo in your code: save_for_backward(input, target)? There is no target in forward, so did you mean weight?

To your question: if any of the output variables is not used in any further calculation, its incoming gradient in backward will be 0.
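
Here is a minimal sketch of what I mean, assuming a recent PyTorch version with the static-method Function API (the class and variable names are just for illustration). Only output1 is used to compute the loss, so the gradient for output arrives in backward as a tensor of zeros:

import torch
from torch.autograd import Function

class Cal(Function):

    @staticmethod
    def forward(ctx, input, weight):
        ctx.save_for_backward(input, weight)
        output = input * weight          # first output
        output1 = input * weight + 0.1   # second, distinct output
        return output, output1

    @staticmethod
    def backward(ctx, grad_output, grad_output1):
        input, weight = ctx.saved_tensors
        # grad_output is all zeros because output never reaches the loss;
        # grad_output1 carries the actual upstream gradient.
        print("grad_output: ", grad_output)
        print("grad_output1:", grad_output1)
        # d(output)/d(input) = d(output1)/d(input) = weight, and similarly for weight
        grad_input = (grad_output + grad_output1) * weight
        grad_weight = (grad_output + grad_output1) * input
        return grad_input, grad_weight

x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)
out, out1 = Cal.apply(x, w)
out1.sum().backward()   # only out1 is used, so out's incoming gradient is zero

If you would rather receive None instead of zeros for unused outputs, I believe recent versions let you call ctx.set_materialize_grads(False) inside forward.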

Yes, thanks, it is a typo. I have changed it.
Okay. Thanks a lot.