It has been a while since the last reply in this thread, but I have a similar question and an associated error, so I thought it would be more appropriate to post it here. In PyTorch 2.1.0+cu118 I have set up two different ways, Code I and Code II, to compute second derivatives of a multivariable function. Code I works as expected, but Code II fails. How can I make the computation succeed in Code II as well?
Code I:
import torch

def f(x, t):
    return (3 * x ** 3) * t + (t ** 3) * x

x = torch.tensor([2.06, 5.7], requires_grad=True)
t = torch.tensor([1.5, 0.33], requires_grad=True)
y = f(x, t)
dy_dt = torch.autograd.grad(y.sum(), t, create_graph=True)[0]  # create_graph=True keeps the graph for higher-order grads
dy_dx = torch.autograd.grad(y.sum(), x, create_graph=True)[0]
d2y_dx2 = torch.autograd.grad(dy_dx.sum(), x)[0]  # second derivative w.r.t. x
loss = dy_dt + dy_dx - 0.01 * d2y_dx2
print(loss)
Output I:
tensor([100.2378, 653.6338], grad_fn=<SubBackward0>)
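(As a sanity check, my own arithmetic agrees with Output I: dy/dx = 9x^2*t + t^3, dy/dt = 3x^3 + 3t^2*x, and d2y/dx2 = 18xt, so at (x, t) = (2.06, 1.5) the loss is 40.1304 + 60.6636 - 0.01 * 55.62 ≈ 100.2378, matching the first entry.)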
Code II:
x = torch.tensor([2.06, 5.7], requires_grad=True)
t = torch.tensor([1.5, 0.33], requires_grad=True)
model = torch.nn.Linear(2, 1)
var_input = torch.stack([x, t], dim=1)  # shape (2, 2): one (x, t) pair per row
u = model(var_input)
du_dt = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
du_dx = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
d2u_dx2 = torch.autograd.grad(du_dx.sum(), x)[0]  # raises the RuntimeError below
loss = du_dt + du_dx - 0.01 * d2u_dx2
print(loss)
Output II:
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-40-bee5e3fcde20> in <cell line: 12>()
10 du_dt = torch.autograd.grad(u.sum(), t, create_graph=True)[0]
11 du_dx = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
---> 12 d2u_dx2 = torch.autograd.grad(du_dx.sum(), x)[0]
13
14 loss = du_dt + du_dx - 0.01 * d2u_dx2
/usr/local/lib/python3.10/dist-packages/torch/autograd/__init__.py in grad(outputs, inputs, grad_outputs, retain_graph, create_graph, only_inputs, allow_unused, is_grads_batched, materialize_grads)
392 )
393 else:
--> 394 result = Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass
395 t_outputs,
396 grad_outputs_,
RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.
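My current understanding of why this happens: torch.nn.Linear is affine, so du_dx is just a constant weight entry that does not depend on x. That means x never appears in the graph of du_dx, and the second derivative is identically zero. Following the error message's own suggestion, the variant below does run for me (a minimal sketch, assuming zero second derivatives are acceptable here):

d2u_dx2 = torch.autograd.grad(
    du_dx.sum(), x, allow_unused=True, materialize_grads=True
)[0]  # materialize_grads=True returns zeros instead of None for the unused input x

Alternatively, if the model were nonlinear, e.g. u = torch.tanh(model(var_input)), then du_dx would genuinely depend on x and the original grad call should work unchanged. Is one of these the intended approach, or is there a better way?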