Hello, this might be an implementation-detail question.
While reimplementing decoupled neural interfaces (DNI), I need to modify Variable.grad after calling Variable.backward(). However, I found that I can't find Variable.is_leaf on any of my variables, even though the documentation says it exists. Also, because I have to make sure it's valid to manually modify the grad tensor (it is valid, I hope?), I looked at the source code under the torch.optim folder, and Variable.is_leaf does appear there. So how or when does this attribute get set? Thanks in advance!
This is how I tested it:
In [1]: import torch
...: from torch.autograd import Variable
...:
...: x = Variable(torch.randn(5, 5), requires_grad=True)
...: y = Variable(torch.randn(5, 5), requires_grad=True)
...:
...: a = x * 2
...: b = y * 3
In [2]: a.sum().backward()
In [3]: a.is_leaf
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
......
P.S. I just found out that grad_fn is not set either?
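For what it's worth, based on the documentation I would expect both attributes to be set at tensor creation time rather than by backward() — a quick sketch of that expectation, assuming a recent PyTorch where Variable is merged into Tensor:

```python
import torch

# Created directly by the user -> a leaf, with no grad_fn.
x = torch.randn(5, 5, requires_grad=True)

# Created by an autograd operation -> not a leaf, grad_fn records the op.
a = x * 2

print(x.is_leaf, x.grad_fn)  # True None
print(a.is_leaf)             # False; a.grad_fn is a MulBackward node
```

No call to backward() is needed for either attribute to be populated.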