How to compute the gradient of a 2D tensor with respect to another 2D tensor?

For example, I have a 2D tensor Y of size MxN and another 2D tensor X of size Mx3, where N > 3.

I want to compute the gradient of Y with respect to X.

I can compute it column by column as follows:

dY_dX = torch.zeros(Y.shape[1], *X.shape)  # shape (N, M, 3)
for i in range(Y.shape[1]):
    dY_dX[i, :, :] = grad(Y[:, i], X)

This is the gradient function I am using:

def grad(outputs, inputs):
    # gradient of outputs.sum() with respect to inputs, keeping the graph for further differentiation
    return torch.autograd.grad(outputs, inputs, grad_outputs=torch.ones_like(outputs), create_graph=True)[0]

The shape of dY_dX should be NxMx3. The operation above does compute the gradients, but it is computationally expensive because it makes one backward pass per column of Y. Can anyone recommend a faster way to compute the same result without the loop?
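For reference, here is a minimal self-contained sketch of what I am doing; the sizes M, N, the weight W, and the mapping Y = X @ W are made up purely so the example runs, my real Y comes from a more involved computation:

import torch

def grad(outputs, inputs):
    # gradient of outputs.sum() with respect to inputs, keeping the graph for further differentiation
    return torch.autograd.grad(outputs, inputs, grad_outputs=torch.ones_like(outputs), create_graph=True)[0]

# Made-up sizes and mapping, just for illustration
M, N = 5, 4
X = torch.randn(M, 3, requires_grad=True)
W = torch.randn(3, N)
Y = X @ W  # Y has shape (M, N) and depends on X

# One backward pass per column of Y
dY_dX = torch.zeros(N, M, 3)
for i in range(Y.shape[1]):
    dY_dX[i, :, :] = grad(Y[:, i], X)

print(dY_dX.shape)  # torch.Size([4, 5, 3]), i.e. N x M x 3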
