Hi all, I tried to build a custom loss function and found that when I use torch.where() in the loss computation, the error "element 0 of tensors does not require grad and does not have a grad_fn" is raised as soon as I start training the model. If I just remove torch.where(), training works.

```
class Custom_loss(nn.Module):
    def __init__(self):
        super().__init__()

    def forward(self, y_hat, y):
        prob = torch.abs((y_hat - y) / y)
        t = torch.ones_like(prob)
        f = torch.zeros_like(prob)
        prob = torch.where(prob < 0.1, t, f)
        return torch.sum(prob)
```
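For context, here is a minimal standalone sketch of what seems to be going on: torch.where() selects its values from t and f, which are fresh constant tensors, so its output is disconnected from the graph of y_hat (the condition itself is not differentiable). The smooth-sigmoid workaround below is just one assumption about a possible fix, not necessarily the intended behaviour of the loss; the threshold 0.1 and temperature 0.01 are illustrative values.

```python
import torch

# Reproduce the issue: the hard 0/1 output of torch.where is detached
# from y_hat's autograd graph, so the summed loss has no grad_fn.
y_hat = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = torch.tensor([1.05, 2.5, 3.1])

prob = torch.abs((y_hat - y) / y)  # still differentiable here
hard = torch.where(prob < 0.1, torch.ones_like(prob), torch.zeros_like(prob))
loss_hard = torch.sum(hard)        # loss_hard.requires_grad is False
                                   # -> loss_hard.backward() would raise

# One possible workaround: replace the hard step with a smooth surrogate
# (a steep sigmoid around the 0.1 threshold) so gradients can flow.
soft = torch.sigmoid((0.1 - prob) / 0.01)  # ~1 when prob < 0.1, ~0 otherwise
loss_soft = torch.sum(soft)
loss_soft.backward()               # works: y_hat.grad is populated
```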

Is there a way to get around it?

Thanks!