Suppose F(x) is a neural network with 3 inputs and 1 output, and I want to compute the gradient of F with respect to the first input. When I try, I get the error message: “One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.” What is the problem?
My code is as follows:
import torch
import torch.nn as nn
from torch.autograd import Variable, grad

NN = nn.Sequential(
    torch.nn.Linear(3, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)

X = Variable(torch.rand(5, 3), requires_grad=True)
F = NN(X)
# The next line raises the "appears to not have been used in the graph" error:
G = grad(torch.sum(F), X[:, 0], create_graph=True)[0]
print(G)
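For comparison, if I instead differentiate with respect to the whole X and slice the result afterwards, it runs without the error. This is a minimal sketch of that workaround, under my assumption that `X[:, 0]` creates a new tensor that is not itself part of the graph, whereas the leaf tensor `X` is:

```python
import torch
import torch.nn as nn
from torch.autograd import grad

NN = nn.Sequential(
    nn.Linear(3, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

# Leaf tensor: this exact object is used in the forward pass,
# so it does appear in the autograd graph.
X = torch.rand(5, 3, requires_grad=True)
F = NN(X)

# Differentiate with respect to the whole leaf X, then slice column 0:
G_full = grad(torch.sum(F), X, create_graph=True)[0]  # shape (5, 3)
G = G_full[:, 0]  # gradient wrt the first input feature, per sample
print(G)
```

Is the underlying issue really that `X[:, 0]` is a different tensor from the one used in the forward pass, and is slicing the full gradient the idiomatic fix?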