Mean Directional Accuracy criterion loss function

Hello, can someone help me create a Mean Directional Accuracy loss function for PyTorch?
In NumPy it should be this: https://gist.github.com/bshishov/5dc237f59f019b26145648e2124ca1c9
import numpy as np

def mda(actual: np.ndarray, predicted: np.ndarray):
    """Mean Directional Accuracy"""
    return np.mean((np.sign(actual[1:] - actual[:-1]) == np.sign(predicted[1:] - predicted[:-1])).astype(int))

but I get errors if I use it as a loss criterion in PyTorch.
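A literal PyTorch translation (a sketch, assuming 1-D float tensors; the name mda_torch is just for illustration) would be something like:

import torch

def mda_torch(actual: torch.Tensor, predicted: torch.Tensor) -> torch.Tensor:
    # Mean Directional Accuracy: fraction of steps where the predicted change
    # has the same sign as the actual change
    same_direction = torch.sign(actual[1:] - actual[:-1]) == torch.sign(predicted[1:] - predicted[:-1])
    return same_direction.float().mean()

This works as an evaluation metric, but the comparison produces a boolean tensor with no grad_fn, so calling backward() on the result raises an error.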

You cannot have sign in the loss function and expect good results: sign is piecewise constant, so its gradient is zero almost everywhere. The cast to int won’t harmonize with autograd either.
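A quick way to see the first point (a minimal sketch):

import torch

# sign() is flat almost everywhere, so its gradient is zero
x = torch.tensor([0.5, -1.2, 2.0], requires_grad=True)
torch.sign(x).sum().backward()
print(x.grad)  # tensor([0., 0., 0.]) -- nothing useful flows back

The cast to int is even worse: an integer (or boolean) tensor is detached from the graph entirely, so backpropagation through it is not possible at all.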

The result of the function there is the mean as a float (not an int).

You cannot have sign in the loss function and expect good results

I think I understand the reason, but could that be "added" to the result of a normal MSE to also give importance to the direction?

Probably multiplying would work better.
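As a rough illustration of the multiplying idea (a sketch under my own assumptions, not an established recipe; the name mse_with_direction and the alpha/temperature knobs are made up here), the hard sign can be replaced with tanh so gradients flow, and the MSE scaled by a directional penalty:

import torch

def mse_with_direction(pred: torch.Tensor, target: torch.Tensor,
                       alpha: float = 1.0, temperature: float = 10.0) -> torch.Tensor:
    # plain MSE term
    mse = torch.mean((pred - target) ** 2)
    # soft "sign" of consecutive differences; tanh(k*x) approaches sign(x) as k grows
    d_pred = torch.tanh(temperature * (pred[1:] - pred[:-1]))
    d_true = torch.tanh(temperature * (target[1:] - target[:-1]))
    # roughly 1 where directions agree, roughly 0 where they disagree
    agreement = (d_pred * d_true + 1) / 2
    directional_penalty = 1 - agreement.mean()  # in [0, 1]
    # multiply rather than add, so wrong directions inflate the MSE
    return mse * (1 + alpha * directional_penalty)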

And I might add, it could be a good idea to take, say, 3-5 elements of your prediction and target vectors and visualize the loss you’re getting.
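For example, with five toy points (numbers made up purely for illustration), the squared error and the directional hits can be printed side by side:

import torch

target = torch.tensor([1.0, 2.0, 1.5, 2.5, 3.0])
pred   = torch.tensor([1.2, 1.8, 1.9, 2.4, 3.3])  # misses the direction of one step

mse = torch.mean((pred - target) ** 2)
mda = (torch.sign(target[1:] - target[:-1])
       == torch.sign(pred[1:] - pred[:-1])).float().mean()
print(f"MSE: {mse.item():.4f}  MDA: {mda.item():.2f}")  # small error, but only 3 of 4 directions correct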