Does nn.MarginRankingLoss propagate gradients to both inputs?

# Definition of nn.MarginRankingLoss (older PyTorch API with size_average/reduce):
import torch.nn.functional as F
from torch.nn.modules.loss import _Loss

class MarginRankingLoss(_Loss):
    def __init__(self, margin=0, size_average=True, reduce=True):
        super(MarginRankingLoss, self).__init__(size_average, reduce)
        self.margin = margin

    def forward(self, input1, input2, target):
        return F.margin_ranking_loss(input1, input2, target, self.margin,
                                     self.size_average, self.reduce)

I wonder whether both input1 and input2 receive gradients during back-propagation.

Can anyone answer my question?
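
In the meantime, here is a minimal sketch I would use to check this empirically (the shapes, margin, and values are arbitrary, just for illustration):

import torch
import torch.nn as nn

# Two score tensors; target = +1 means input1 should rank higher than input2.
input1 = torch.randn(4, requires_grad=True)
input2 = torch.randn(4, requires_grad=True)
target = torch.ones(4)

loss_fn = nn.MarginRankingLoss(margin=0.5)
loss = loss_fn(input1, input2, target)
loss.backward()

# If gradients flow to both inputs, neither .grad should be None.
print(input1.grad)  # gradient w.r.t. input1
print(input2.grad)  # gradient w.r.t. input2

Note that for elements where the hinge is inactive (the loss is clamped to zero), the corresponding gradient entries will be exactly 0, but both .grad attributes should still be populated rather than None.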