Gradient of an arbitrary function

Hi all, I am a new PyTorch user. I am working on a network structure that does not use the input value directly, but first computes a function of the input.

I am starting with the simplest case: I pass the input to a function and ask PyTorch to compute the derivative of that function with respect to the input. Here is the code:

import numpy as np
import torch
from torch.autograd import Variable

def get_torch_G2(Rij, Rs, eta):
    return Variable(0.5 * torch.exp(-eta * (Rij - Rs).pow(2)) * (torch.cos(np.pi * Rij / 12) + 1), requires_grad=True)

dd = torch.tensor([2.2], requires_grad=True)
bb = get_torch_G2(dd, 0, 1)
bb.backward()
dd.grad.clone()


This gives me the error “‘NoneType’ object has no attribute ‘clone’”. Can anyone help me with this problem, please?

Just remove the Variable wrapper and it will work. Wrapping the result in Variable(..., requires_grad=True) creates a new leaf tensor that is cut off from the graph connecting it to dd, so backward() never propagates back to dd and dd.grad stays None. (Variable is deprecated anyway; a tensor created with requires_grad=True is all you need.)

import math
import torch

def get_torch_G2(Rij, Rs, eta):
    # No Variable wrapper: the result stays connected to Rij's autograd graph
    return 0.5 * torch.exp(-eta * (Rij - Rs).pow(2)) * (torch.cos(math.pi * Rij / 12) + 1)

dd = torch.tensor([2.2], requires_grad=True)
bb = get_torch_G2(dd, 0, 1)
bb.backward()
dd.grad.clone()
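
As a follow-up sketch (not part of the original answer): if you would rather not deal with .grad at all, torch.autograd.grad returns the derivative directly, and a quick finite-difference estimate is an easy way to convince yourself the value is right. The helper name and the step size h below are just illustrative choices.

import math
import torch

def get_torch_G2(Rij, Rs, eta):
    return 0.5 * torch.exp(-eta * (Rij - Rs).pow(2)) * (torch.cos(math.pi * Rij / 12) + 1)

dd = torch.tensor([2.2], requires_grad=True)
bb = get_torch_G2(dd, 0, 1)

# Ask autograd for d(bb)/d(dd) directly instead of accumulating into dd.grad
(grad_dd,) = torch.autograd.grad(bb, dd)

# Cheap sanity check: central finite difference at the same point
with torch.no_grad():
    h = 1e-3
    fd = (get_torch_G2(dd + h, 0, 1) - get_torch_G2(dd - h, 0, 1)) / (2 * h)

print(grad_dd, fd)  # both should print approximately the same value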