Hello
I aimed to calculate the Jacobian of a tensor (n by m) with respect to a tensor (m by d), so I tried this code:
torch.autograd.functional.jacobian(output,W)
where output is the output of my network, and it gives me the following error:
outputs = func(*inputs)
TypeError: 'Tensor' object is not callable
Would you give me some advice on how I can handle that?
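One likely cause, sketched below: the first argument to `torch.autograd.functional.jacobian` must be a callable that computes the output, not the output tensor itself. The shapes and the forward function `f` here are hypothetical stand-ins for the actual network:

```python
import torch

# Assumed illustrative shapes: the output is (n, m), W is (m, d).
n, m, d = 4, 3, 2
X = torch.randn(n, d)  # stand-in for the fixed network input
W = torch.randn(m, d)  # the tensor we differentiate with respect to

def f(W):
    # stand-in for the forward pass that produces the (n, m) output from W
    return X @ W.T

# Pass the callable f (not its output tensor) as the first argument.
J = torch.autograd.functional.jacobian(f, W)
print(J.shape)  # torch.Size([4, 3, 3, 2]) -- output shape followed by input shape
```

Calling `jacobian(output, W)` fails with exactly that `TypeError`, because internally the function tries `output(W)`.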
Hello, I have a neural network which has a fully connected (FC) layer with a ReLU activation function before it (i.e. the input of this FC layer comes from a ReLU activation). If we call this input 'x' and this layer 'h', I would like to compute the Jacobian of h with respect to its input x. So we expect a 2D matrix of partial derivatives dh_j/dx_i. I used 'jacobian' and it gave me a 4D matrix (x and h are both 2D matrices). On the other hand, 'vjp' gave me a 2D matrix… I'm confused about which one is correct!
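Both results can be consistent: `jacobian` returns the full Jacobian, whose shape is the output shape followed by the input shape (hence 4D for 2D tensors), while `vjp` contracts that Jacobian with a vector shaped like the output and returns something shaped like the input. A minimal sketch, using a hypothetical `nn.Linear` layer as `h`:

```python
import torch

# Illustrative stand-ins: h is a Linear layer, x is a 2D (batched) input.
fc = torch.nn.Linear(3, 5)
x = torch.randn(2, 3)

# Full Jacobian: 4D, (batch, out_features, batch, in_features).
J = torch.autograd.functional.jacobian(fc, x)
print(J.shape)  # torch.Size([2, 5, 2, 3])

# vjp contracts the Jacobian with a vector v shaped like the output,
# so the result is shaped like the input -- 2D, not the full Jacobian.
v = torch.ones(2, 5)
out, vjp_result = torch.autograd.functional.vjp(fc, x, v)
print(vjp_result.shape)  # torch.Size([2, 3])
```

So neither is wrong; they answer different questions. `jacobian` gives every partial derivative, while `vjp` gives one Jacobian-vector contraction.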
That was an example of the partial derivative of a 1D vector h with respect to a 1D vector x; I assumed 'h' and 'x' are both 1D, but in my problem they are 2D. So shall I use 'jacobian'? There is a problem with it: https://pytorch.org/docs/stable/autograd.html?highlight=jacobian#torch.autograd.functional.jacobian
How can I pass x and h to this function? Remember that I want to compute the Jacobian of h, which is an FC layer, with respect to its input x, which is the output of a ReLU activation function.
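A possible approach, assuming `h` is an `nn.Linear` layer and `x` is the batched post-ReLU tensor (the names and shapes below are illustrative): pass the layer itself as the callable and `x` as the input. Because the batch dimension is part of the input, the result is 4D, and the per-sample 2D matrices dh_j/dx_i sit on its block diagonal:

```python
import torch

# Illustrative stand-ins for the FC layer h and its post-ReLU input x.
fc = torch.nn.Linear(3, 5)
x = torch.relu(torch.randn(2, 3))  # batch of 2, assumed output of a ReLU

# The callable is the layer; the input is x. Result is 4D:
# (batch, out_features, batch, in_features).
J = torch.autograd.functional.jacobian(fc, x)
print(J.shape)  # torch.Size([2, 5, 2, 3])

# Samples are independent, so the cross-sample blocks are zero; the 2D
# per-sample Jacobians dh_j/dx_i are the diagonal blocks.
per_sample = torch.stack([J[b, :, b, :] for b in range(x.shape[0])])
print(per_sample.shape)  # torch.Size([2, 5, 3])

# Sanity check: for a linear layer, each per-sample Jacobian is the weight matrix.
assert torch.allclose(per_sample[0], fc.weight)
```

Extracting the diagonal blocks this way gives exactly the 2D matrices you expected, one per sample in the batch.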
Would you please give me a piece of advice?
Thanks in advance.