For one of my tasks, I need to compute a forward derivative of the network output (not the loss function) w.r.t. a given input X. Mathematically, it would look like this:

J_ij = ∂y_i / ∂x_j, where y = f(X) is the network output
This is essentially the Jacobian of the output. It differs from backpropagation in two ways. First, we want the derivative of the network output, not of the loss function. Second, it is calculated w.r.t. the input X rather than the network parameters. I think this can be achieved in TensorFlow using `tf.gradients()`. How do I perform this op in PyTorch? I am not sure whether I can use the `backward()` function here.
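For context, here is a minimal sketch of how such a Jacobian can be computed with `torch.autograd.grad`, calling it once per output component with the input as the differentiation target (the `Linear` layer below is just a stand-in for an actual network):

```python
import torch

# Stand-in network: 3 inputs -> 2 outputs
torch.manual_seed(0)
net = torch.nn.Linear(3, 2)

x = torch.randn(3, requires_grad=True)
y = net(x)

# Build J[i, j] = dy_i / dx_j row by row: one backward pass per
# output component, differentiating w.r.t. the input x, not the
# network parameters. retain_graph=True keeps the graph alive
# across the repeated calls.
rows = []
for i in range(y.shape[0]):
    (grad_x,) = torch.autograd.grad(y[i], x, retain_graph=True)
    rows.append(grad_x)
jacobian = torch.stack(rows)  # shape (2, 3)

# Sanity check: for a linear layer y = Wx + b, the Jacobian
# w.r.t. x is exactly the weight matrix W.
assert torch.allclose(jacobian, net.weight)
```

Newer PyTorch versions also ship `torch.autograd.functional.jacobian(func, inputs)`, which wraps this loop for you.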

Thanks