I am trying to compute the accuracy of a regression model using a 10% error margin, i.e. a prediction counts as correct when:
abs(model output - target) <= 0.1 * target
How can I do that with tensors? I subtracted the two tensors and took the absolute value using torch.sub() and torch.abs(). Now I want to count how many elements are within the 10% margin. How do I do that?
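One way to count the elements (a minimal sketch with made-up example tensors): comparing a tensor with `<=` yields a boolean mask, and `.sum()` on that mask counts the `True` entries. Taking `torch.abs` of the target in the margin term guards against negative targets flipping the inequality.

```python
import torch

# Hypothetical stand-ins for your model output and targets
output = torch.tensor([1.05, 2.50, 3.10, 4.00])
target = torch.tensor([1.00, 2.00, 3.00, 4.00])

# Boolean mask: True where the absolute error is within 10% of the target
within_margin = torch.abs(output - target) <= 0.1 * torch.abs(target)

# .sum() counts True values; .numel() gives the total element count
count = within_margin.sum().item()
accuracy = count / target.numel()
print(count, accuracy)  # → 3 0.75
```

If you already have the absolute-difference tensor from torch.abs(torch.sub(...)), you can compare it directly: `(diff <= 0.1 * torch.abs(target)).sum()`.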