I am trying to implement a higher-order gradient to optimize the parameters of a network, i.e.,
```python
var_input = autograd.Variable(input_data, requires_grad=True)  # Variable is deprecated; a plain tensor with requires_grad=True also works
loss = model(var_input)
# autograd.grad returns a tuple, so unpack the single input gradient
gradient, = autograd.grad(loss, inputs=var_input, create_graph=True,
                          retain_graph=True, only_inputs=True)
train_loss = loss_fn(gradient)
gradients = autograd.grad(train_loss, inputs=list(model.parameters()),
                          create_graph=False, retain_graph=False,
                          only_inputs=True)
```
However, I get this error:

```
One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.
```
Could you tell me how to check which tensor is not in the graph?
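One way to diagnose this (a sketch, not specific to your model) is to pass `allow_unused=True` so that `autograd.grad` returns `None` for every input that does not appear in the graph, then report those entries by name. The model and the deliberately unused parameter below are hypothetical placeholders:

```python
import torch
from torch import nn, autograd

model = nn.Linear(4, 1)
# Hypothetical parameter that never participates in the forward pass,
# so it is guaranteed to be "not used in the graph".
unused = nn.Parameter(torch.randn(3))

x = torch.randn(8, 4, requires_grad=True)
loss = model(x).pow(2).mean()

named = list(model.named_parameters()) + [("unused", unused)]
params = [p for _, p in named]

# allow_unused=True replaces the error with a None gradient for
# each parameter that the loss does not depend on.
grads = autograd.grad(loss, params, allow_unused=True)

missing = [name for (name, _), g in zip(named, grads) if g is None]
print(missing)  # names of parameters not reached by the backward graph
```

Once you know which parameters come back as `None`, you can either exclude them from the `inputs` list or keep `allow_unused=True` and skip the `None` gradients when you apply the update.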