mu not getting a gradient

mu.grad consistently comes out as all zeros. Is this normal? Here is the code:

import torch
import math

mu = torch.zeros(2, requires_grad=True)
sigma = 1.0
eps = torch.randn_like(mu)
# reparameterized sample: carries mu in the autograd graph
sampled = mu + sigma * eps

# log-density of the sample under N(mu, sigma ** 2)
logp = -((sampled - mu) ** 2) / (2 * sigma ** 2) - math.log(sigma) - 0.5 * math.log(2 * math.pi)
loss = -logp.sum()
loss.backward()

print("loss:", loss)
print("eps:", eps)
print("mu.grad:", mu.grad)  # always tensor([0., 0.])

I suspect the reason: in (sampled - mu), the mu inside sampled = mu + sigma * eps cancels the mu being subtracted, so logp reduces to -eps ** 2 / 2 - log(sigma) - 0.5 * log(2 * pi), which does not depend on mu at all. A zero gradient is therefore exactly what autograd should report.
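
If that is the problem, a minimal workaround sketch (assuming the goal is a score-function / REINFORCE-style gradient, which is my guess at the intent here) is to detach the sample, so that mu enters logp only through the density term:

import torch
import math

mu = torch.zeros(2, requires_grad=True)
sigma = 1.0
eps = torch.randn_like(mu)

# detach() cuts the autograd path through the sample itself,
# so mu only influences logp through the density term
sampled = (mu + sigma * eps).detach()
logp = -((sampled - mu) ** 2) / (2 * sigma ** 2) - math.log(sigma) - 0.5 * math.log(2 * math.pi)
loss = -logp.sum()
loss.backward()
print("mu.grad:", mu.grad)  # -(sampled - mu) / sigma ** 2, generally nonzero

Alternatively, if the intent is the reparameterization trick, the gradient is supposed to flow through sampled into some downstream objective, not through the log-density of the sample under its own distribution.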