While implementing a custom loss function, I need to threshold a tensor, and gradients must still flow through this operation during the `.backward()` pass with autograd.

I have a tensor of shape (N, 7) and need to count, for each row, how many values are greater than a threshold (th). The result should be a tensor of shape (N, 1).

Toy example:

```
In = [[1, 2, 3, 4, 5, 6, 7],
      [1, 5, 4, 2, 6, 11, 2],
      [0, 0, 3, 4, 8, 7, 11]]
th = 5
then,
Out = [[2], [2], [3]]
```

The problem is that a direct hard threshold, e.g. `(x > th).float().sum(dim=1)`, has zero gradient almost everywhere, since the comparison is a step function that autograd cannot differentiate through, so the gradients vanish.
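One common workaround is to replace the hard comparison with a smooth surrogate. The sketch below (my own assumption, not from the question) uses a sigmoid with a hypothetical `temperature` hyperparameter as a soft step, plus a straight-through-estimator variant that keeps the exact integer count in the forward pass while borrowing the sigmoid's gradient in the backward pass:

```python
import torch

# Soft count: sigmoid(temperature * (x - th)) ~ 1 for x well above th,
# ~ 0 for x well below th; a value exactly at th contributes 0.5.
# `temperature` controls how sharp the soft step is (assumed, tune as needed).
def soft_count_above(x, th, temperature=10.0):
    soft = torch.sigmoid(temperature * (x - th))
    return soft.sum(dim=1, keepdim=True)  # shape (N, 1)

# Straight-through estimator: exact hard count in the forward pass,
# sigmoid-surrogate gradient in the backward pass.
def count_above_ste(x, th, temperature=10.0):
    soft = torch.sigmoid(temperature * (x - th)).sum(dim=1, keepdim=True)
    hard = (x > th).float().sum(dim=1, keepdim=True)
    # Forward value equals `hard`; gradient flows only through `soft`.
    return soft + (hard - soft).detach()

x = torch.tensor([[1., 2., 3., 4., 5., 6., 7.],
                  [1., 5., 4., 2., 6., 11., 2.],
                  [0., 0., 3., 4., 8., 7., 11.]], requires_grad=True)

out = count_above_ste(x, th=5.0)   # forward value: [[2.], [2.], [3.]]
out.sum().backward()               # x.grad is nonzero near the threshold
```

Higher temperatures make the soft count closer to the hard count but shrink the gradient for values far from the threshold, so the temperature is a trade-off between accuracy and gradient signal.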