How will the backward pass be affected by applying conditionals on the output?

Hi,
I have a network which is outputting coordinate pairs that are used to update a tensor, which is then used in a calculation.

# x, y -> output tensors from a torch model that are differentiable for autograd
mask = torch.zeros(size)
for i in range(something):
    if x[i] > x[i + 1] and <conditionals based on x and y>:
        mask[i] = 1.

I have a loss that is calculated on the basis of this mask. But since the mask is freshly created, it is detached from the computational graph.
Is there a way to work around this issue or is this fully non-differentiable?
Thanks

Do you actually want the mask to be differentiable w.r.t. x?
If not, you can wrap the mask generation with torch.no_grad().
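For example, a minimal sketch of what I mean (assuming size, something, x, and y are defined as in your snippet):

import torch

with torch.no_grad():
    # nothing inside this block is recorded in the autograd graph
    mask = torch.zeros(size)
    for i in range(something):
        if x[i] > x[i + 1]:  # plus your other conditions on x and y
            mask[i] = 1.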

Otherwise, the binary ‘compare’ operator is not differentiable, even when you think about it mathematically.
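You can check this directly with a toy x:

import torch

x = torch.randn(5, requires_grad=True)
gt = x[:-1] > x[1:]              # element-wise compare of neighbours
print(gt.dtype)                  # torch.bool
print(gt.grad_fn)                # None -> the comparison is not recorded in the graph
print(gt.float().requires_grad)  # False, even after casting to float

The result is piecewise constant in x, so its gradient is zero almost everywhere and undefined at the threshold.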
Maybe you can provide more details on what you are trying to achieve?

My plan was to calculate a loss utilizing the mask that is generated by this code. But since the mask is the only piece from the model being used in the loss, and the mask is detached from the graph, no updates were being made to the model parameters. The loss is a function of this mask and the input.
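A minimal sketch of the situation (the model, input, and loss below are just placeholders for my actual code):

import torch
import torch.nn as nn

model = nn.Linear(4, 2)               # placeholder for the real network
inp = torch.randn(3, 4)
out = model(inp)                      # differentiable output
x, y = out[:, 0], out[:, 1]           # stand-ins for the coordinate pairs (y unused here)

mask = torch.zeros(x.shape[0])
for i in range(x.shape[0] - 1):
    if x[i] > x[i + 1]:               # placeholder condition
        mask[i] = 1.

loss = (mask * inp.sum(dim=1)).sum()  # loss depends only on the mask and the input
print(loss.requires_grad)             # False -> loss.backward() would error out and
                                      # the model parameters never receive gradients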