How to calculate the 2nd derivative of a likelihood function

For example, I followed this blog post [Second order derivatives and in-place gradient "zeroing"] to try to get the 2nd derivative, but it turns out that grd.grad is None for every gradient tensor. Can anyone give me some suggestions?

import torch
from torch import Tensor
from torch.autograd import Variable
from torch.autograd import grad
from torch import nn

# some toy data
x = Variable(Tensor([4., 2.]), requires_grad=False)
y = Variable(Tensor([1.]), requires_grad=False)

# linear model and squared difference loss
model = nn.Linear(2, 1)
loss = torch.sum((y - model(x))**2)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)

# instead of using loss.backward(), use torch.autograd.grad() to compute gradients
loss_grads = grad(loss, model.parameters(), create_graph=True)

gn2 = sum([grd.norm()**2 for grd in loss_grads]) # squared gradient norm; backprop through it gives 2nd-order terms
print('loss %f grad norm %f' % (loss.item(), gn2.item()))
model.zero_grad()
gn2.backward()
optimizer.step()

for grd in loss_grads:
    print(grd.grad)

Every printed grd.grad is None.
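For reference, a minimal sketch of one possible fix (reusing the same toy data and model, written in current tensor style rather than the deprecated Variable API): the tensors returned by grad() are non-leaf nodes, so autograd discards their .grad by default unless retain_grad() is called on them before the backward pass. The second-order information with respect to the parameters themselves lands in the parameters' .grad fields after gn2.backward().

import torch
from torch import nn
from torch.autograd import grad

# same toy data and linear model as above
x = torch.tensor([4., 2.])
y = torch.tensor([1.])
model = nn.Linear(2, 1)
loss = torch.sum((y - model(x))**2)

# first-order gradients; create_graph=True keeps them differentiable
loss_grads = grad(loss, model.parameters(), create_graph=True)

# ask autograd to populate .grad on these non-leaf tensors
for grd in loss_grads:
    grd.retain_grad()

gn2 = sum(grd.norm()**2 for grd in loss_grads)
gn2.backward()

for grd in loss_grads:
    print(grd.grad)  # d(gn2)/d(grd) = 2*grd, no longer None

for p in model.parameters():
    print(p.grad)  # 2nd-order term: gradient of the squared gradient norm w.r.t. p

The retain_grad() calls only matter if you want to inspect the gradients of gn2 with respect to the first-order gradients themselves; if you only need the second-order update applied to the parameters, the parameters' .grad fields already carry it after gn2.backward().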