The problem of calculating second-order derivatives with torch.autograd.grad

We can calculate higher-order derivatives with the help of torch.autograd.grad.
But the following code raises an error:

import torch

a = torch.randn(2, 3).requires_grad_(True)
b = a * 3
b = torch.sum(b)
d = torch.autograd.grad(b, a, retain_graph=True, create_graph=True)[0]
d_2 = torch.autograd.grad(d, a)[0]

When computing d_2, the console prints “RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn”. However, many examples online show that setting create_graph=True makes d a tensor that requires grad. I am really confused.

Hey, please refer to this post to understand why. Feel free to add follow-up questions there.
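For future readers, a short sketch of the likely cause (I can't reproduce the linked post here, so treat this as my own explanation): in the snippet above, b = a * 3 is linear in a, so its gradient d is a constant tensor of 3s. Even with create_graph=True, a constant gradient has no grad_fn connecting it back to a, which triggers the RuntimeError on the second call. With a nonlinear function, the first gradient still depends on a and the second pass works:

```python
import torch

a = torch.randn(2, 3).requires_grad_(True)
b = torch.sum(a ** 3)  # nonlinear in a, so the gradient depends on a

# create_graph=True records the backward pass itself, so d gets a grad_fn
d = torch.autograd.grad(b, a, create_graph=True)[0]  # d = 3 * a**2

# d is non-scalar, so reduce it (or pass grad_outputs) before differentiating again
d_2 = torch.autograd.grad(d.sum(), a)[0]  # d_2 = 6 * a
```

Note that d must be reduced to a scalar (or given grad_outputs) for the second call, since torch.autograd.grad can only implicitly create gradients for scalar outputs.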


Thank you very much!

I am really sorry about the comments I left in this thread. (My original comment was edited by the administrator, and the part containing personal remarks is no longer visible.)
My original intention was simply to express my gratitude. As a beginner in PyTorch and also in Western culture, I learned from a textbook that compliments and praise are viewed positively in Western culture. However, I failed to consider the etiquette of avoiding personal remarks in the context of a technical forum. If my comments made anyone uncomfortable, please forgive my rudeness and thoughtlessness.