List of parameters without defining a module?

Could PyTorch print out a list of the parameters in a computational graph if the parameters are not in a module? For example, print the list of parameters feeding into d in the following computational graph:

import torch
from torch.autograd import Variable
a = Variable(torch.rand(1, 4), requires_grad=True)
b = a**2
c = b*2
d = c.mean()

d.backward()

Hi,

No such function exists at the moment.
I guess you could traverse the graph using d.grad_fn and .next_functions, finding all the AccumulateGrad functions and getting their .variable attribute. This would give you all the tensors in which gradients will be accumulated (possibly 0-valued) if you call backward on d.
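Something along these lines should work (a minimal sketch; the helper name get_leaf_tensors is made up for illustration): starting from d.grad_fn, walk next_functions and collect the .variable of every AccumulateGrad node.

def get_leaf_tensors(output):
    leaves = []
    seen = set()
    stack = [output.grad_fn]
    while stack:
        fn = stack.pop()
        if fn is None or fn in seen:
            continue
        seen.add(fn)
        # AccumulateGrad nodes hold the leaf tensor in their .variable attribute
        if hasattr(fn, 'variable'):
            leaves.append(fn.variable)
        # next_functions is a tuple of (Function, input_index) pairs
        stack.extend(next_fn for next_fn, _ in fn.next_functions)
    return leaves

print(get_leaf_tensors(d))  # for the graph above, returns [a], the only leaf

Note this only finds leaves that require gradients; tensors created with requires_grad=False never get an AccumulateGrad node, so they won't appear in the list.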

Why do you need such a function? Don’t you already know which tensors are used in your computations?