NaN in higher-order derivatives

I would like to compute higher-order derivatives in PyTorch, but the result is NaN.
Here is my code:

import torch

# Pairwise Euclidean distance matrix between the three points in X.
X = torch.tensor([[[0., 0., 0.], [1., 0., 0.], [3., 0., 0.]]], requires_grad=True)
adjacency_matrix = (X.unsqueeze(2) - X.unsqueeze(1)).norm(dim=3)
Y = torch.matmul(adjacency_matrix, adjacency_matrix)

# First- and second-order gradients of Y.sum() with respect to X.
grad1 = torch.autograd.grad(Y.sum(), X, create_graph=True)[0]
grad2 = torch.autograd.grad(grad1.sum(), X, create_graph=True)[0]
print(grad2)

The output grad2 is all NaN. Could you please tell me how to get the correct result?
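
I suspect it is related to the diagonal of the distance matrix: the 2-norm of the zero vector is not differentiable at zero, so higher-order derivatives through those entries may blow up. Would adding a small epsilon under the square root be the right fix? Here is a minimal sketch of that idea (the eps value of 1e-12 is just my own guess, not anything prescribed by PyTorch):

import torch

X = torch.tensor([[[0., 0., 0.], [1., 0., 0.], [3., 0., 0.]]], requires_grad=True)

# Build the distance matrix as sqrt(squared sum + eps) instead of norm(),
# so the derivative stays finite even where the distance is zero.
eps = 1e-12  # arbitrary small constant, my assumption
diff = X.unsqueeze(2) - X.unsqueeze(1)
adjacency_matrix = torch.sqrt((diff ** 2).sum(dim=3) + eps)
Y = torch.matmul(adjacency_matrix, adjacency_matrix)

grad1 = torch.autograd.grad(Y.sum(), X, create_graph=True)[0]
grad2 = torch.autograd.grad(grad1.sum(), X, create_graph=True)[0]
print(grad2)  # finite values rather than NaN

Is this the recommended approach, or is there a cleaner way to handle the zero-distance diagonal?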