Python API for autograd graph

I am trying to understand how backprop is implemented by the autograd engine. I learned that when a new tensor is created, it includes a grad_fn attribute that tells us which operation created it, and that grad_fn in turn has a next_functions attribute conveying information about the parent nodes. But during backprop, operations like matmul need their forward-pass inputs; are those exposed through some Python functions? I understand they are stored in autograd metadata so that the autograd engine knows how nodes were created during the forward pass. Is there a way to access this graph in Python?


grad_fn and next_functions are the only things that are available from Python, I'm afraid. The other constructs are more complex and specific to each Node, so we don't have custom Python bindings for each of them.
What kind of information are you looking for?
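For what it's worth, grad_fn and next_functions are enough to walk the whole recorded graph from an output back to the leaves. A minimal sketch (the printed node names like MmBackward0 depend on your PyTorch version):

```python
import torch

inp = torch.randn(3, 3, requires_grad=True)
result = torch.matmul(inp * 3, inp * 4)

def walk(fn, depth=0):
    """Recursively print the autograd graph starting from a grad_fn node."""
    if fn is None:  # reached a leaf with requires_grad=False
        return
    print("  " * depth + type(fn).__name__)
    # next_functions is a tuple of (parent_node, input_index) pairs
    for parent, _ in fn.next_functions:
        walk(parent, depth + 1)

walk(result.grad_fn)
```

Running this prints the matmul node at the root, its two mul parents, and AccumulateGrad nodes at the leaves where gradients land on inp.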

import torch

inp = torch.randn(3, 3, requires_grad=True)
another = torch.randn(3, 3, requires_grad=True)
inter = inp * 3
inter2 = inp * 4
result = torch.matmul(inter, inter2)

We know that result was created using matmul in the forward pass, but autograd must also record the parent nodes inter and inter2 and their corresponding values in order to do backprop, right? Is that dependency and edge information available through Python? For example, in a custom autograd Function we have ctx for saving info for backward; is that ctx information exposed in Python?

For most ops, no. Because they can save any value they want, we don't have a unified API to access these values.
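For a custom autograd Function, though, you control the saving yourself, so the values are visible in Python inside backward via ctx.saved_tensors. A small illustrative example (Square is my own hypothetical Function, not a built-in op):

```python
import torch

class Square(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        # Stash the input; autograd keeps it alive for the backward pass.
        ctx.save_for_backward(x)
        return x * x

    @staticmethod
    def backward(ctx, grad_output):
        # Retrieve what forward saved: this is the ctx data in Python.
        (x,) = ctx.saved_tensors
        return grad_output * 2 * x  # d(x^2)/dx = 2x

x = torch.randn(3, requires_grad=True)
y = Square.apply(x)
y.sum().backward()
```

After backward, x.grad equals 2 * x, confirming the saved tensor was available where the gradient formula needed it. For built-in ops, the equivalent saved values live in C++ Node metadata with no such unified Python accessor.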