Higher order derivatives

Hello! I have the following code to take higher-order derivatives (simplified example):

import torch
from torch.autograd import grad

x = torch.tensor([3.], requires_grad=True)
y = x**4
for i in range(5):
    print(i,y)
    grads = grad(y, x, create_graph=True)[0]
    y = grads.sum()

This is the output:

0 tensor([81.], grad_fn=<PowBackward0>)
1 tensor(108., grad_fn=<SumBackward0>)
2 tensor(108., grad_fn=<SumBackward0>)
3 tensor(72., grad_fn=<SumBackward0>)
4 tensor(24., grad_fn=<SumBackward0>)

It does what I want, except for the fact that it prints 108 twice. Why does it do that?

Because the first and second derivatives of f(x) = x ** 4 both happen to evaluate to 108 at x = 3:

0: f(x) = (x ** 4) = 81
1: f’(x) = 4 * (x ** 3) = 108
2: f’’(x) = 12 * (x ** 2) = 108
3: f’’’(x) = 24 * x = 72
4: f’’’’(x) = 24
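
The chain of derivatives above can be checked directly against the analytic values. This is a sketch of your loop with an assertion added (the `expected` list holds the hand-computed values from the table above); note the loop stops differentiating after the fourth derivative, since `grad` would raise an error once `y` no longer depends on `x`:

```python
import torch
from torch.autograd import grad

x = torch.tensor([3.], requires_grad=True)
y = (x ** 4).sum()

# Analytic values of f, f', f'', f''', f'''' at x = 3
expected = [81., 108., 108., 72., 24.]

values = []
for k in range(5):
    values.append(y.item())
    if k < 4:
        # create_graph=True keeps the graph so we can differentiate again
        y = grad(y, x, create_graph=True)[0].sum()

print(values)  # [81.0, 108.0, 108.0, 72.0, 24.0]
assert values == expected
```

So the repeated 108 is not a bug in autograd; it is just a coincidence of evaluating x ** 4 and its derivatives at x = 3.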