Newbie autograd question

Is there a way to create a callable gradient function? Take, for example, the following snippet from the Autograd tutorial:

import autograd.numpy as np
from autograd import grad

def tanh(x):
    y = np.exp(-x)
    return (1.0 - y) / (1.0 + y)

grad_tanh = grad(tanh)
grad_tanh(1.0)

It is not clear how one can do this using the grad function in PyTorch.

Thanks!

Tanh is already implemented, as are most common functions. You can use torch.tanh(your_tensor) directly.
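
For instance, a minimal sketch of getting its gradient at x = 1.0 with the standard requires_grad / backward workflow (the expected value is 1 - tanh(1)^2 ≈ 0.42):

import torch

x = torch.tensor(1.0, requires_grad=True)
out = torch.tanh(x)   # built-in, differentiable
out.backward()        # populates x.grad with d/dx tanh(x)
print(x.grad)         # tensor(0.4200)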

Yes, it’s possible to write functions that can be differentiated by autograd, but they need to be written using PyTorch operations. So in your case, it would be something like:

import torch

def tanh(x):
    y = torch.exp(-x)
    return (1.0 - y) / (1.0 + y)

out = tanh(torch.rand(1, requires_grad=True))
out.backward()  # works
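
If you specifically want a callable gradient function like Autograd's grad, you can wrap torch.autograd.grad in a small helper. This is only a sketch, assuming f maps a single tensor to a scalar output; the helper name grad_fn is made up for illustration:

import torch

def grad_fn(f):
    # Returns a function that computes df/dx, mirroring Autograd's grad(f).
    def df(x):
        x = x.clone().detach().requires_grad_(True)
        y = f(x)
        (g,) = torch.autograd.grad(y, x)
        return g
    return df

grad_tanh = grad_fn(torch.tanh)
print(grad_tanh(torch.tensor(1.0)))  # ~0.4200, matches 1 - tanh(1)^2

Recent PyTorch versions also provide torch.func.grad, which gives this composable, function-transform style of gradient directly.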