Yes, it’s possible to write functions that autograd can differentiate, as long as they are composed of PyTorch operations. In your case it would be something like:
import torch
from torch.autograd import Variable

def tanh(x):
    # tanh(x) = (1 - e^(-2x)) / (1 + e^(-2x))
    y = torch.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

out = tanh(Variable(torch.rand(1), requires_grad=True))
out.backward()  # works
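As a quick sanity check (my own addition, not part of the original snippet), you can compare the gradient of this hand-written tanh against the built-in torch.tanh. This sketch assumes a PyTorch version where tensors accept requires_grad directly (0.4+); on older versions wrap with Variable as above.

# Compare gradients of the hand-written tanh and the built-in torch.tanh.
x1 = torch.rand(1, requires_grad=True)
x2 = x1.detach().clone().requires_grad_(True)

tanh(x1).backward()        # gradient of the hand-written version
torch.tanh(x2).backward()  # gradient of the built-in version

print(x1.grad, x2.grad)    # both should equal 1 - tanh(x)^2

If the two printed gradients match, autograd is tracking the hand-written version exactly as it does the built-in one.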