Hello all! I need to calculate the elementwise derivative of a function applied to a tensor. Is there a more efficient way to do this than first calculating the full Jacobian and then taking its diagonal? A lot of computation time is wasted, since all of the off-diagonal values are thrown away. Here is a minimal working example of what I would like to accomplish more quickly.

```
import torch

x = torch.linspace(0, 10, 11, requires_grad=True)

def f(x):
    return x**2

def gradf(x):
    # Builds the full 11x11 Jacobian, then keeps only its diagonal
    return torch.diag(torch.autograd.functional.jacobian(f, x))

dfdx = gradf(x)
```
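For comparison, here is what I believe an equivalent single backward pass would look like, using a vector-Jacobian product with a ones vector instead of materializing the Jacobian (just a sketch, assuming `f` stays elementwise so its Jacobian really is diagonal):

```
import torch

x = torch.linspace(0, 10, 11, requires_grad=True)

def f(x):
    return x**2

y = f(x)
# For an elementwise f, a single VJP with a ones vector
# recovers exactly the diagonal of the Jacobian:
dfdx = torch.autograd.grad(y, x, grad_outputs=torch.ones_like(y))[0]
```

I'm not sure whether this is the intended idiom, but for `f(x) = x**2` it appears to give the same result as the diagonal of the Jacobian.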