Traversing Autograd Graph to Find Dependencies (from Python)

Suppose I have an output torch.Tensor computed by some nn.Module's forward(). For each tensor t produced by a Function in the autograd subgraph of that forward(), my goal is to get the list of parameters (presumably also from the same nn.Module) that are directly needed to compute t's gradient, as opposed to intermediate activation tensors that are not parameters of the nn.Module.
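To make the goal concrete: the tensors "directly needed" for a gradient are exactly what forward() stashes via save_for_backward(), and torch.autograd.graph.saved_tensors_hooks lets me observe those tensors as they are saved. Here is a sketch of that idea; the helper name and the storage-pointer matching are my own assumptions (a saved tensor may be a view of a parameter, e.g. a transpose, rather than the parameter object itself, so an identity check alone would miss it):

```python
import torch
import torch.nn as nn

def record_saved_tensors(module, inp):
    """Run module(inp) under saved_tensors_hooks and record every tensor
    that some backward Function saves, tagging those that alias one of
    the module's parameters (matched by data pointer, since a saved
    tensor may be a view of a parameter rather than the parameter itself)."""
    ptr_to_name = {p.data_ptr(): n for n, p in module.named_parameters()}
    records = []  # (shape, parameter name or None)

    def pack(t):
        records.append((tuple(t.shape), ptr_to_name.get(t.data_ptr())))
        return t  # keep default behavior: save the tensor as-is

    def unpack(t):
        return t

    with torch.autograd.graph.saved_tensors_hooks(pack, unpack):
        out = module(inp)
    return out, records

model = nn.Sequential(nn.Linear(4, 3), nn.Tanh(), nn.Linear(3, 2))
_, records = record_saved_tensors(model, torch.randn(2, 4))
for shape, name in records:
    print(shape, name or "<non-parameter>")
```

One limitation of this approach: the pack hook fires per saved tensor but does not tell you which Function is doing the saving, so it cannot by itself attribute each saved parameter to a node in the graph.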

Is this possible to do from Python? If not, is it possible from C++, and what might a solution look like? At a high level, it seems like we want to traverse the autograd graph starting from the output torch.Tensor's grad_fn, following each Function's next_functions down to the AccumulateGrad leaf nodes (one per leaf tensor, including the parameters), and somehow access what each Function passed into save_for_backward(), checking those tensors against the nn.Module's parameters.
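The traversal part of the above does seem doable from pure Python, since grad_fn nodes expose next_functions and AccumulateGrad nodes expose the leaf tensor via their .variable attribute. A minimal sketch of what I have in mind (the function name is hypothetical; this only finds parameters that appear as immediate AccumulateGrad children of each Function, not everything saved by save_for_backward()):

```python
import torch
import torch.nn as nn

def direct_param_children(output, module):
    """For each Function node reachable from output.grad_fn, list the
    module parameters that appear as immediate AccumulateGrad children."""
    name_of = {p: n for n, p in module.named_parameters()}
    deps, seen, stack = [], set(), [output.grad_fn]
    while stack:
        fn = stack.pop()
        if fn is None or fn in seen:
            continue
        seen.add(fn)
        direct = []
        for child, _ in fn.next_functions:
            if child is None:
                continue
            # AccumulateGrad leaves expose their leaf tensor as .variable
            var = getattr(child, "variable", None)
            if var is not None and var in name_of:
                direct.append(name_of[var])
            else:
                stack.append(child)
        deps.append((type(fn).__name__, direct))
    return deps

model = nn.Sequential(nn.Linear(4, 3), nn.Tanh(), nn.Linear(3, 2))
out = model(torch.randn(2, 4)).sum()
deps = direct_param_children(out, model)
for fn_name, params in deps:
    print(fn_name, params)
```

What this sketch does not give me is the save_for_backward() contents per Function, which is why I am asking whether a complete solution needs to drop to C++.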