I have a custom autograd mod function:
```python
import torch
from torch.autograd import Function, Variable

class ModFunction(Function):
    # Note that both forward and backward are @staticmethods
    @staticmethod
    def forward(ctx, x, y):
        ctx.save_for_backward(x, y)
        output = torch.remainder(x, y)
        return output

    @staticmethod
    def backward(ctx, grad_output):
        x, y = ctx.saved_variables
        grad_x = grad_output.clone()
        grad_y = torch.mean(grad_output * (-(x / y).floor()), dim=0)
        return grad_x, grad_y
```
When I run it on a Power 8 machine, `torch.remainder` throws `RuntimeError: the derivative for 'other' is not implemented`. Please help! Thanks!
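For context, that error is raised when `torch.remainder` is called directly on tensors that both require grad, because autograd has no built-in derivative with respect to the divisor (`other`). Routing the call through `ModFunction.apply` avoids it, since the custom `backward` above then supplies the gradients. A minimal sketch of that usage, assuming a current PyTorch where `ctx.saved_tensors` replaces the deprecated `saved_variables`, and with made-up example shapes (a batch `x` and a per-column divisor `y`, so the `dim=0` mean in `backward` yields a gradient matching `y`'s shape):

```python
import torch
from torch.autograd import Function

class ModFunction(Function):
    @staticmethod
    def forward(ctx, x, y):
        ctx.save_for_backward(x, y)
        return torch.remainder(x, y)

    @staticmethod
    def backward(ctx, grad_output):
        # saved_tensors is the current API (saved_variables was removed)
        x, y = ctx.saved_tensors
        grad_x = grad_output.clone()
        # d(remainder)/dy = -floor(x / y); averaged over the batch dim
        # so grad_y has the same shape as y
        grad_y = torch.mean(grad_output * (-(x / y).floor()), dim=0)
        return grad_x, grad_y

# Hypothetical inputs: batch of shape (2, 2), divisor of shape (2,)
x = torch.tensor([[5.0, 7.0], [4.0, 8.0]], requires_grad=True)
y = torch.tensor([3.0, 3.0], requires_grad=True)

# Call through .apply -- NOT torch.remainder(x, y) directly
out = ModFunction.apply(x, y)
out.sum().backward()

print(x.grad)  # tensor([[1., 1.], [1., 1.]])
print(y.grad)  # tensor([-1., -2.])
```

Calling `torch.remainder(x, y)` directly on these tensors would reproduce the RuntimeError, because the custom `backward` is only used when the op goes through `ModFunction.apply`.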