When is .grad None?

Hi there,

I just have a quick question. Assume I have a variable x whose type is:
type(x)
<class 'torch.autograd.variable.Variable'>

When is x.grad None, and when is it not None?

Thanks!

Hi,

x.grad is None when you create the Variable.
It won’t be None if you specified requires_grad=True when creating it and you backpropagated some gradients up to that Variable.
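A minimal sketch of both states (in recent PyTorch, Variable has been merged into Tensor, so a plain tensor behaves the same way):

import torch

x = torch.ones(3, requires_grad=True)  # freshly created leaf tensor
print(x.grad)                          # None: nothing has been backpropagated yet

loss = (x * 2).sum()
loss.backward()                        # backpropagates d(loss)/dx into x.grad
print(x.grad)                          # tensor([2., 2., 2.])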

Hi,

Thanks for the reply.

For example, in my case, I have:
total_rl_loss.creator
<torch.autograd._functions.basic_ops.Add object at 0x1195af138>
total_rl_loss.requires_grad
True
But total_rl_loss.grad is still None. Do you know why?

Thanks!

Pretty sure a variable’s .grad member is only non-None after backpropagating some gradient to it. I.e., you need to call .backward() in order for the variables in a given subgraph to have non-None .grad members.


I totally agree with you. However, I found an interesting usage here:

In line 175, no .backward() has been called yet, but outputs.grad is not None.

.backward() is called in line 173 of that file whenever the model is not in evaluation mode, i.e. it's training. If .backward() is called, outputs.grad will be non-None, as the gradient is backpropagated to the variable. If the model is in evaluation mode, .backward() is not called and None is returned (in that case outputs.grad would be None as well).
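In other words, the pattern is roughly the following hedged sketch; model, criterion, inputs and targets are made-up stand-ins, not the actual identifiers from that file:

import torch
import torch.nn as nn

# Illustrative stand-ins, not the code from the linked file
model = nn.Linear(4, 2)
criterion = nn.MSELoss()
inputs, targets = torch.randn(8, 4), torch.randn(8, 2)

outputs = model(inputs)
outputs.retain_grad()        # outputs is non-leaf; keep its gradient around
loss = criterion(outputs, targets)

if model.training:           # True unless model.eval() has been called
    loss.backward()          # gradient flows back, so outputs.grad becomes non-None

print(outputs.grad is None)  # False here; True if the model had been in eval mode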

I see, thanks for the explanation.

Here’s another scenario where a variable that requires a gradient has no gradient even after backward().

import torch
from torch.autograd import Variable  # kept for backward compatibility in recent PyTorch

x = Variable(torch.ones(2, 2), requires_grad=True)
y = x + 2          # y is a non-leaf variable
y.retain_grad()    # ask autograd to keep y's gradient after backward()
out1 = x.mean()    # out1 depends on x only, not on y
out1.backward()
print(y.grad, y.requires_grad)
out2 = y.mean()    # out2 depends on x through y
out2.backward()
print(x.grad, x.requires_grad)  # 0.25 from out1 + 0.25 from out2 (grads accumulate)

Outputs are:

None True
tensor([[0.5000, 0.5000],
        [0.5000, 0.5000]]) True

The reason is simple: y depends on x, but x does not depend on y. Backpropagating from out1 = x.mean() therefore never reaches y (d out1/dy is zero, so y.grad stays None), while backpropagating from out2 = y.mean() does reach x (d out2/dx is non-zero).
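To complete the picture: continuing the snippet above, after out2.backward() the gradient has in fact reached y, and retain_grad() kept it:

print(y.grad)  # tensor([[0.2500, 0.2500],
               #         [0.2500, 0.2500]]) -- d(out2)/dy = 1/4 per element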

I get param.grad = None even after setting requires_grad=True. I checked, and the variable does exist as list(model.parameters())[0]. Have you encountered such an issue?

Hi,

You will need to be a bit more precise here.
Maybe open a new thread with a code sample showing what you’re trying to do 🙂

Also note that setting requires_grad must be done before performing operations on the Tensor!
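A minimal sketch of that pitfall (illustrative names):

import torch

a = torch.ones(3)
b = a * 2                  # built while a did not require grad, so b is not tracked
a.requires_grad_(True)     # too late for b: it was created before this
print(b.requires_grad)     # False -- b.sum().backward() would raise an error

c = torch.ones(3, requires_grad=True)  # set requires_grad before any ops
d = c * 2
d.sum().backward()
print(c.grad)              # tensor([2., 2., 2.])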

Yes, I will open a new thread for it 🙂