Hi,
I trained a neural network model and would like to compute gradients of its outputs with respect to its inputs, using the following code:
```python
input_var = Variable(torch.from_numpy(X), requires_grad=True).type(torch.FloatTensor)
predicted_Y = model_2.forward(input_var)
predicted_Y.backward(torch.ones_like(predicted_Y), retain_graph=True)
```
where `X` is the input data and `model_2` is the neural network. However, after the backward pass, `input_var.grad` is `None`. I googled it, but most reported cases boil down to `requires_grad` not having been set to `True`, which is not the case here. Does anyone know what might be the problem?
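In case it helps, here is a minimal self-contained version that reproduces what I'm seeing (the random `X` and the `nn.Linear` model are just stand-ins for my real data and network):

```python
import numpy as np
import torch
from torch.autograd import Variable

# Stand-ins for my actual data and trained model
X = np.random.rand(4, 3)            # numpy input, dtype float64
model_2 = torch.nn.Linear(3, 2)     # placeholder for the real network

# Same pattern as in my training code
input_var = Variable(torch.from_numpy(X), requires_grad=True).type(torch.FloatTensor)
predicted_Y = model_2.forward(input_var)
predicted_Y.backward(torch.ones_like(predicted_Y), retain_graph=True)

print(input_var.grad)  # prints None
```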
Thank you!