Rules for knowing whether .grad will be None or not?

  • It’s unclear to me under what circumstances .grad will have a value

  • It seems like it is sometimes ‘optimized out’?

  • How can I either request that it not be ‘optimized out’, and/or obtain the value by some other means?

    In [36]: a = autograd.Variable(torch.rand(2), requires_grad=True)

    In [37]: a.grad

    In [38]: b = a * 2

    In [39]: c = b * 2

    In [40]: c.backward(torch.ones(2))

    In [41]: c.grad

    In [42]: b.grad

    In [43]: a.grad
    Out[43]:
    Variable containing:
    4
    4
    [torch.FloatTensor of size 2]

    In [44]: b.requires_grad
    Out[44]: True

    In [45]: c.requires_grad
    Out[45]: True

So the rules are:

  1. Only user-created Variables (the leaves of the graph) ever get their .grad populated, so b and c in your example will never have one (see the sketch just after this list).
  2. When the Variable is created, its .grad is None.
  3. When you call .backward() on a graph that contains this Variable, its .grad field becomes a Variable containing its gradients, and it never goes back to None.
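
For concreteness, here is a minimal sketch of those rules using the current tensor API (requires_grad set directly on a tensor rather than on a Variable). The register_hook call is just one way to observe an intermediate gradient without relying on .grad; the dict name captured is purely illustrative, not anything from the thread:

    import torch

    # Rule 2: a user-created (leaf) tensor starts out with .grad == None.
    a = torch.rand(2, requires_grad=True)
    assert a.grad is None

    # Rule 1: intermediate results never get their .grad populated by backward().
    b = a * 2
    c = b * 2

    # One way to still see an intermediate gradient: register a hook on b.
    captured = {}
    b.register_hook(lambda grad: captured.update({"b": grad}))

    c.backward(torch.ones(2))

    print(a.grad)         # tensor([4., 4.])  <- Rule 3: filled in after backward()
    print(b.grad)         # None              <- Rule 1: non-leaf, stays None
    print(captured["b"])  # tensor([2., 2.])  <- gradient of c w.r.t. b, via the hook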

OK, fair enough. So, basically, one way of thinking about this is that at least one of .creator and .grad is guaranteed to be None? There’s no way that both .creator and .grad can be simultaneously non-None?


Exactly.

Also, FYI: .creator has been deprecated in favor of .grad_fn.
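
A quick way to see that split with the renamed attribute (again a small sketch on the current tensor API; the exact grad_fn repr may vary by version):

    import torch

    x = torch.rand(2, requires_grad=True)
    y = (x * 2).sum()
    y.backward()

    # Leaf created by the user: the gradient gets stored, but there is no grad_fn.
    print(x.grad_fn, x.grad)   # None tensor([2., 2.])

    # Result of an operation: it has a grad_fn, but its .grad stays None.
    print(y.grad_fn, y.grad)   # <SumBackward0 object at ...> None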

Awesome, thanks! And good info on .grad_fn. By the way, I kind of hated the .creator name, and I think .grad_fn is… hmm… moderately better 🙂. But grad_fn implies it is the function that calculates the gradient, which I’m not sure is the case? Anyway, off-topic for this thread really…