# Jacobian for every layer

Is there an efficient way of getting the Jacobian of every layer in PyTorch?
For example, if I have `x = nn.Conv2d(...)(y)`, I would like to see the local gradient `dx/dy`, and similarly for all the other layers in the network.

You mean `dx/dy` given your formula, right?

It depends what you want it for.
If it is only for debugging, you can compute the Jacobian of a layer the same way you would for any other function, for example with `torch.autograd.functional.jacobian`.
If you want it efficiently, you can usually write down the analytical formula for the layer's Jacobian and implement it directly.
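For the debugging case, a minimal sketch (the layer sizes here are arbitrary assumptions, just to keep the Jacobian small):

```python
import torch
import torch.nn as nn

# A small conv layer on a tiny input, so the full Jacobian stays cheap.
conv = nn.Conv2d(2, 3, kernel_size=3)
y = torch.randn(1, 2, 4, 4)

# Treat the layer as an ordinary function and ask autograd for dx/dy.
J = torch.autograd.functional.jacobian(conv, y)

# The output x has shape (1, 3, 2, 2), and J has shape x.shape + y.shape.
print(J.shape)  # torch.Size([1, 3, 2, 2, 1, 2, 4, 4])
```

Note the Jacobian's size grows as `x.numel() * y.numel()`, which is why this is only practical for debugging on small inputs.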

Note that autograd never computes the full Jacobian: the backward formulas it has only compute vector-Jacobian products, so a full Jacobian takes one backward pass per output element.
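To make the vector-Jacobian-product point concrete, here is a sketch that rebuilds the full Jacobian one row at a time with `torch.autograd.grad`, using a one-hot `grad_outputs` vector per output element (again with assumed small layer sizes):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(2, 3, kernel_size=3)
y = torch.randn(1, 2, 4, 4, requires_grad=True)
x = conv(y)

# Each backward pass yields one VJP: v^T (dx/dy) for the chosen vector v.
# Recovering the full Jacobian costs one VJP per element of x.
rows = []
for i in range(x.numel()):
    v = torch.zeros(x.numel())
    v[i] = 1.0
    (g,) = torch.autograd.grad(x, y, grad_outputs=v.view_as(x),
                               retain_graph=True)
    rows.append(g.flatten())

J = torch.stack(rows)  # shape: (x.numel(), y.numel())
print(J.shape)  # torch.Size([12, 32])
```

This is essentially what `torch.autograd.functional.jacobian` does internally, which is why the full Jacobian is expensive when the output is large.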