Hello, can anyone please point out what I am doing wrong in the computation of the following two functions? Both f1 and f2 should be identical real-to-real functions and should therefore give the same derivative. However, they produce different outputs. Thanks a lot.
```python
import torch

def f1(x):
    # go through a complex intermediate: view [x, 0] as x + 0j, then take abs
    return torch.view_as_complex(torch.Tensor([x, 0])).unsqueeze(0).abs()

def f2(x):
    # plain real abs
    return torch.Tensor(x).abs()

x = torch.Tensor([1])
print(torch.autograd.functional.jacobian(f1, x))
print(torch.autograd.functional.jacobian(f2, x))
```
This outputs:

```
tensor([[0.]])
tensor([[1.]])
```
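For reference, here is a small sketch I used to confirm that the two constructions agree in the forward pass (note this is a variant of the code above, not the original: it uses `torch.stack` and `torch.tensor` instead of the `torch.Tensor([...])` constructor to build the complex value, and assumes a 1-element input):

```python
import torch

def f1(x):
    # variant construction: stack [x, 0] into a float pair, then view as x + 0j
    # (torch.stack / torch.tensor here are my substitution for torch.Tensor([x, 0]))
    pair = torch.stack([x[0], torch.tensor(0.)])
    return torch.view_as_complex(pair).unsqueeze(0).abs()

def f2(x):
    return x.abs()

x = torch.tensor([1.])
print(f1(x))  # tensor([1.])
print(f2(x))  # tensor([1.])
```

Both functions return the same forward value here, so whatever differs between the two Jacobians above must come from the backward pass, not from the function values themselves.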