I’m trying to compute the gradient of 1/x without using PyTorch’s autograd. I use the formula grad(1/x, x) = -1/x**2, but when I compare my result with the gradient computed by PyTorch’s autograd, the two values differ.
Here is my code:
import numpy as np
import torch

dtype = torch.float32
a = torch.tensor(np.random.randn(), dtype=dtype, requires_grad=True)
loss = 1 / a
loss.backward()
print(a.grad - (-1 / a**2))
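For reference, here is a minimal self-contained sketch of the same comparison with a fixed input value instead of a random one, so the run is deterministic. It computes the analytic gradient under `torch.no_grad()` so that the comparison tensor stays out of the autograd graph, and uses `torch.allclose` to check agreement within floating-point tolerance (the value 2.0 and the float32 dtype are assumptions for illustration):

```python
import torch

# Fixed input so the comparison is reproducible.
a = torch.tensor(2.0, dtype=torch.float32, requires_grad=True)

loss = 1 / a
loss.backward()

# Analytic gradient of 1/x evaluated at a, computed outside the graph.
with torch.no_grad():
    analytic = -1 / a**2

diff = (a.grad - analytic).abs().item()
print(diff)  # any difference should be at floating-point rounding level
print(torch.allclose(a.grad, analytic))
```

Any nonzero difference here comes from floating-point rounding: autograd and the hand-written formula may evaluate the same derivative through different sequences of operations, and each rounds slightly differently in finite precision.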