Jacobian of a Function of a Jacobian

Hi, I am computing the Jacobian of a function of a Jacobian. I have a vector-valued function f and a matrix-valued function g, both differentiable:

	y = f(x)
	nabla = jacobian(y, x)
	function_nabla = g(nabla)
	hessian = jacobian(function_nabla, x)

nabla is computed without problems, and so is g(nabla). g is a standard PyTorch function such as torch.inverse.
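
For concreteness, here is a minimal stand-in for my setup (f, W, and the shapes are placeholders; together with the jacobian helper shown below, this reproduces the failure):

	import torch

	torch.manual_seed(0)

	# Placeholder f: a small tanh layer, chosen so the Jacobian is a
	# square (3, 3) matrix that torch.inverse can be applied to.
	W = torch.randn(3, 3)
	x = torch.randn(3, requires_grad=True)

	y = torch.tanh(W @ x)                   # y = f(x)
	nabla = jacobian(y, x)                  # fine
	function_nabla = torch.inverse(nabla)   # g(nabla), also fine
	hessian = jacobian(function_nabla, x)   # this is the call that fails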

I am using the following function to compute the Jacobian:

	import torch

	def jacobian(y, x, create_graph=True):
		jac = []
		flat_y = y.reshape(-1)
		flat_y.retain_grad()
		grad_y = torch.zeros_like(flat_y)
		for i in range(len(flat_y)):
			# Backpropagate a one-hot grad_outputs vector to get the
			# gradient of the i-th output component, i.e. one row of
			# the Jacobian.
			grad_y[i] = 1.
			grad_x, = torch.autograd.grad(flat_y, x, grad_y, retain_graph=True, create_graph=create_graph)
			jac.append(grad_x.reshape(x.shape))
			grad_y[i] = 0.

		return torch.stack(jac).reshape(y.shape + x.shape)
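
First-order calls behave as expected; for example, on a toy elementwise square (not my actual f):

	x = torch.randn(4, requires_grad=True)
	y = x ** 2
	J = jacobian(y, x)  # shape (4, 4) with 2 * x on the diagonal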

The second jacobian call (for the Hessian) always fails with:

	RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [1, 32]] is at version 64; expected version 63 instead.

I can’t pinpoint which operation is in-place here. Thank you.
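
In case it helps with the diagnosis: the failing call can be wrapped in anomaly mode (torch.autograd.detect_anomaly), which prints the forward-pass traceback of the operation whose saved tensor was modified:

	with torch.autograd.detect_anomaly():
		hessian = jacobian(function_nabla, x)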
