I am trying to compute a Jacobian matrix in a differentiable manner using torch.autograd.functional.jacobian with create_graph=True, but the output and input dimensions are so large that the GPU runs out of memory.
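For context, here is a minimal sketch of what I am doing now (f is a toy stand-in for my real model):

import torch

def f(x):
    return x ** 2  # toy stand-in for the real model

x = torch.randn(6, requires_grad=True)
# differentiable full Jacobian; memory grows with
# (number of outputs) x (number of inputs)
J = torch.autograd.functional.jacobian(f, x, create_graph=True)
print(J.shape)  # torch.Size([6, 6])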
I realized I only need a few specific columns of the Jacobian matrix, i.e., the derivatives of the outputs w.r.t. a few specific entries of the input. If I could compute just that part of the Jacobian in a differentiable manner, it would save a lot of memory.
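In other words, column j of the Jacobian is the Jacobian-vector product J @ e_j with the one-hot vector e_j, so something like torch.func.jvp should give a single column without materializing J. A sketch of that idea below (f and j are placeholders, and I am not sure the result stays differentiable, which is what I need):

import torch

def f(x):
    return x ** 2  # placeholder for the real model

x = torch.randn(6, requires_grad=True)
j = 2
e_j = torch.zeros_like(x)
e_j[j] = 1.0  # one-hot tangent selecting column j

# forward-mode JVP: returns f(x) and J @ e_j, i.e. column j of the Jacobian
y, col = torch.func.jvp(f, (x,), (e_j,))
print(col)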
So I tried the following:
import torch

x = torch.tensor([1., 2., 3., 4., 5., 6.], requires_grad=True)
y = x
# ask for d y[0] / d x[0]; allow_unused=True returns None instead of raising
d = torch.autograd.grad(y[0], x[0], retain_graph=True, create_graph=True, allow_unused=True)
print(d)
Output:

(None,)

But I just get None, presumably because indexing x creates a new tensor, so x[0] is not on the autograd path from x to y[0].
Is there any way to compute the derivatives of the outputs w.r.t. parts of the inputs in a differentiable manner?
Or any other method for computing specific columns of the Jacobian matrix in a differentiable manner?
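For example, would the double-backward construction (a JVP written as two VJPs) give a differentiable Jacobian column? A sketch of what I mean, with f again a toy placeholder:

import torch

def f(x):
    return x ** 2  # toy placeholder

x = torch.randn(6, requires_grad=True)
y = f(x)

v = torch.zeros_like(x)
v[2] = 1.0  # select column 2 of the Jacobian

# JVP via two VJPs: g = u^T J as a function of the dummy cotangent u,
# then d(g . v)/du = J v
u = torch.ones_like(y, requires_grad=True)
g = torch.autograd.grad(y, x, grad_outputs=u, create_graph=True)[0]
col = torch.autograd.grad(g, u, grad_outputs=v, create_graph=True)[0]
print(col)  # d y / d x[2], still attached to the autograd graph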