Higher derivatives

Consider the function f(x, y) = sin(x) sinh(y). We can easily show that f_xx(x, y) + f_yy(x, y) == 0.
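Spelled out, differentiating twice in each variable gives

f_xx(x, y) = -sin(x) sinh(y)
f_yy(x, y) = sin(x) sinh(y)

so the sum is identically zero for every (x, y).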
Why, then, does the following code give me something else?

import torch

N = 10  # number of sample points
p = torch.rand(N, 2).requires_grad_()  # x and y are the first and second columns of p
f = torch.sin(p[:, 0]) * torch.sinh(p[:, 1])

# First derivatives: df_dp[:, 0] should be f_x, df_dp[:, 1] should be f_y.
df_dp = torch.autograd.grad(f, p, grad_outputs=torch.ones_like(f),
                            create_graph=True, only_inputs=True)[0]

# Second derivatives, same pattern.
df_dpp = torch.autograd.grad(df_dp, p, grad_outputs=torch.ones_like(df_dp),
                             create_graph=True, only_inputs=True)[0]

# I expected this to print (approximately) zeros, but it does not.
print(df_dpp.sum(dim=1))

I can confirm that the columns of df_dp contain the true f_x and f_y derivatives.
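For what it's worth, here is how I checked that, comparing df_dp against the analytic partials f_x = cos(x) sinh(y) and f_y = sin(x) cosh(y) (a minimal sketch; the tolerance is my own choice):

# Check the first derivatives against the closed-form partials.
x, y = p[:, 0], p[:, 1]
fx_analytic = torch.cos(x) * torch.sinh(y)  # f_x = cos(x) sinh(y)
fy_analytic = torch.sin(x) * torch.cosh(y)  # f_y = sin(x) cosh(y)
print(torch.allclose(df_dp[:, 0], fx_analytic, atol=1e-6))  # expected: True
print(torch.allclose(df_dp[:, 1], fy_analytic, atol=1e-6))  # expected: True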