Higher-order derivative: "One of the differentiated Tensors has not been used in the graph"


I have the toy example below and am trying to take the second-order derivative of the output with respect to the input. I was able to use `torch.autograd.grad` to compute the first-order derivative, but the second call fails with an error along the lines of "One of the differentiated Tensors appears to not have been used in the graph". When I set `allow_unused=True`, the call just returns `None` instead. Can someone explain what is wrong in this toy example? Thank you!

import torch

class TOY_NN(torch.nn.Module):

    def __init__(self):
        super().__init__()  # required before registering submodules
        self.layer1 = torch.nn.Linear(1, 3)
        self.layer2 = torch.nn.Linear(3, 2)

    def forward(self, x):
        x = self.layer1(x)
        x = self.layer2(x)
        return x

model = TOY_NN()  # renamed from `nn` to avoid shadowing the usual torch.nn alias
x = torch.tensor([3.]).requires_grad_(True)
out = model(x)
d_out_d_in = torch.autograd.grad(outputs=out, inputs=x, retain_graph=True,
                                 create_graph=True, grad_outputs=torch.ones_like(out))[0]
# This call raises the "not been used in the graph" RuntimeError;
# with allow_unused=True it returns None instead.
d_out_d_in_2 = torch.autograd.grad(outputs=d_out_d_in, inputs=x, retain_graph=True,
                                   create_graph=False, allow_unused=False,
                                   grad_outputs=torch.ones_like(d_out_d_in))[0]