grad.grad_fn is None in Hessian calculation

While debugging a Hessian calculation in my code, I found that grad.grad_fn is an undesired None.
The following is a simplification of the problem; I need non-None values in both cases.

In the following code, grad.grad_fn is not None.

import torch
import torch.nn as nn

x = torch.randn(5)
f = lambda x : (x ** 2).sum()
x.requires_grad_(True)
with torch.enable_grad():
    fval = f(x)
    print(fval)
    grad, = torch.autograd.grad(fval, x, create_graph=True)
#     grad.requires_grad = True
    print(grad)  # the printed repr includes grad_fn=<...>, i.e. grad_fn is not None

However, when f’s definition is replaced with the following one, we get None:

import torch
import torch.nn as nn


x = torch.randn(5)
f = lambda x : x.sum()
x.requires_grad_(True)
with torch.enable_grad():
    fval = f(x)
    grad, = torch.autograd.grad(fval, x, create_graph=True)
    print(grad.grad_fn)  # prints None

What should I do to make it non-None?
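
For context, here is roughly the second-derivative step I ultimately want to run (the vector v and the Hessian-vector-product formulation are a simplification of my real code). It works with the first f, but fails with the second one, where grad has no grad_fn:

import torch

x = torch.randn(5, requires_grad=True)
fval = (x ** 2).sum()   # with x.sum() the second grad call below fails
grad, = torch.autograd.grad(fval, x, create_graph=True)

v = torch.randn(5)
# Hessian-vector product: differentiate grad @ v with respect to x again.
# This only works when grad is part of the autograd graph (grad.grad_fn is not None).
hvp, = torch.autograd.grad(grad @ v, x)
print(hvp)  # 2 * v here; with f = x.sum() this raises a RuntimeError instead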