Yes, I think this is expected behavior, as explained here.
I.e., torch.autograd.grad expects to find x in the computation graph, but it won't be there, since the output no longer depends on x: it's a constant.
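A minimal sketch of the situation (using a detached tensor as a stand-in for the constant output; names are illustrative):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.detach() * 2  # detaching cuts the graph, so y is a constant w.r.t. x

try:
    # Fails: y has no grad_fn, so x cannot be reached from it
    torch.autograd.grad(y.sum(), x)
except RuntimeError as e:
    print("grad failed:", e)
```

If the output legitimately may not depend on some inputs, passing `allow_unused=True` to `torch.autograd.grad` returns `None` for those inputs instead of raising.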