torch.autograd.grad error, but this is my first time calling this function

I was trying to call torch.autograd.grad to compute gradients. However, it raised the following error:

RuntimeError: Trying to backward through the graph a second time, 
but the saved intermediate results have already been freed. 
Specify retain_graph=True when calling backward the first time.

Actually, it was my first time calling it. Any hints on how to solve this issue?

What is really strange is that the error only happens when debugging in VS Code, at the line below:

grad = torch.autograd.grad(loss, self.model.parameters(), allow_unused=True)

However, when running the same code without the debugger, that line raises no error.
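
For context, here is a minimal sketch of roughly what I am doing. The model, data, and loss here are placeholders, not my actual code, which is more involved:

import torch
import torch.nn as nn

# Placeholder model and data, just to illustrate the call pattern.
model = nn.Linear(10, 1)
x = torch.randn(4, 10)
y = torch.randn(4, 1)

loss = nn.functional.mse_loss(model(x), y)

# This is the only backward/grad call in my code, yet under the
# VS Code debugger it raises "Trying to backward through the graph
# a second time".
grad = torch.autograd.grad(loss, model.parameters(), allow_unused=True)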