.grad is None after backward() is called, and requires_grad is True(!)


So I have no idea what’s going on.
Here is my code.

The output is definitely a function of the input (the model is a pretty good gender classifier, about 93% accurate).

import torch
from torch.autograd import Variable

def compute_saliency_maps(X, y, model):
    # Make sure the model is in "test" mode
    model.eval()
    # Wrap the input tensors in Variables
    X_var = Variable(X, requires_grad=True).cuda()
    y_var = Variable(y).cuda()
    scores = model(X_var)
    # Get the scores of the correct classes.
    scores = scores.gather(1, y_var.view(-1, 1)).squeeze()
    # Backward pass; need to supply initial gradients of the same shape as scores.
    scores.backward(torch.ones(scores.size()).cuda())
    # Get the gradient for the image.
    saliency = X_var.grad  # Here X_var.grad is still None! What the hell?!
    return saliency

Please see this post for an answer: Why can't I see .grad of an intermediate variable?

In your case, note that since you call .cuda(), the variable that you store in X_var is not the leaf Variable.
Either do
X_var = Variable(X.cuda(), requires_grad=True)

or

X_var = Variable(X, requires_grad=True)
X_var_cuda = X_var.cuda()
scores = model(X_var_cuda)
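
To see why, here is a minimal CPU-only sketch of the leaf/non-leaf distinction (it uses .double() as a stand-in for .cuda(), since both return a new non-leaf tensor; written against current PyTorch, where Variable has been merged into Tensor):

import torch

# A leaf tensor: created directly by the user, so autograd retains its .grad.
x = torch.ones(3, requires_grad=True)
print(x.is_leaf)   # True

# Any op on it -- .cuda(), .double(), x * 1, ... -- returns a NEW non-leaf
# tensor whose .grad is NOT retained by default.
y = x.double()
print(y.is_leaf)   # False

y.sum().backward()
print(x.grad)      # tensor([1., 1., 1.])
print(y.grad)      # None -- exactly the problem in the question

# Fix 1 (as in the answer above): convert/move the data BEFORE creating the leaf.
x_fixed = torch.ones(3, dtype=torch.float64, requires_grad=True)

# Fix 2: explicitly ask autograd to keep the intermediate's gradient.
z = x_fixed * 2
z.retain_grad()
z.sum().backward()
print(z.grad)      # tensor([1., 1., 1.], dtype=torch.float64)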

You guys are awesome!!

Thanks :)