Efficiently Get Gradient Of All Neurons wrt Input

I’m looking to get the gradient of every single neuron in a network f with respect to the input x. Is there an efficient way to do this? One naive approach would be to set x.requires_grad = True and then backprop the output of each neuron separately in a for loop, recording the values of x.grad each time. However, with such an approach a lot of computation would be repeated, since every backward pass traverses the same shared graph. I feel like there should be a much more efficient way to do this.
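To make the naive approach concrete, here is a minimal sketch of what I mean; the small two-layer net and the input size are just hypothetical placeholders, and I use forward hooks so that "neuron" covers hidden units as well as the final outputs:

```python
import torch

torch.manual_seed(0)
f = torch.nn.Sequential(
    torch.nn.Linear(4, 8),
    torch.nn.Tanh(),
    torch.nn.Linear(8, 3),
)

# Record every layer's output with forward hooks.
activations = []
hooks = [
    m.register_forward_hook(lambda mod, inp, out: activations.append(out))
    for m in f
]

x = torch.randn(4, requires_grad=True)
f(x)
for h in hooks:
    h.remove()

# One backward pass per neuron, retaining the shared graph between passes.
grads = {}
for layer_idx, act in enumerate(activations):
    for neuron_idx in range(act.numel()):
        (g,) = torch.autograd.grad(
            act.flatten()[neuron_idx], x, retain_graph=True
        )
        grads[(layer_idx, neuron_idx)] = g
```

Each entry of `grads` is the gradient of one neuron with respect to `x`, but the number of backward passes grows with the total neuron count, which is exactly the redundancy I'd like to avoid.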

Any help would be greatly appreciated!