Get exact formula used by autograd to compute gradients

I am implementing a custom CNN with some custom modules in it. I have implemented only the forward pass for the custom modules and left their backward pass to autograd.
I have manually derived the formulae for backpropagation through the parameters of the custom modules, and I would like to check whether they match the formulae used internally by autograd to compute the gradients.
Is there any way to see this?
Thanks

Hi,

I am afraid this will be tricky, as autograd does not build a single formula but computes the gradient as a chain of operations.
You can use tools like https://github.com/szagoruyko/pytorchviz/ to see the set of functions that will be executed in the backward pass. But you will have to manually “unpack” these, write down the formula for each, and then combine them all together.
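For example, a minimal sketch along these lines (assuming the torchviz package and Graphviz are installed; the model here is just a placeholder) will render the chain of backward nodes:

```python
import torch
from torchviz import make_dot

# A small placeholder model, just to have something to visualize.
x = torch.randn(1, 3, 8, 8, requires_grad=True)
conv = torch.nn.Conv2d(3, 4, kernel_size=3)
out = conv(x).relu().sum()

# make_dot draws the chain of grad_fn nodes that autograd will run through
# during the backward pass.
dot = make_dot(out, params=dict(conv.named_parameters()))
dot.render("backward_graph", format="png")  # writes backward_graph.png
```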

Note that if you want to make sure your custom Function (with the manual gradient) computes the right thing, you can use torch.autograd.gradcheck().
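For instance, a minimal gradcheck sketch for a hypothetical custom Function (here called `MyFunc`, just for illustration) could look like this:

```python
import torch
from torch.autograd import gradcheck

class MyFunc(torch.autograd.Function):
    """Hypothetical custom Function with a manually written backward."""
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return x ** 2

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        return 2 * x * grad_output

# gradcheck compares the analytical backward against finite differences;
# it expects double-precision inputs with requires_grad=True.
inp = torch.randn(4, 5, dtype=torch.double, requires_grad=True)
print(gradcheck(MyFunc.apply, (inp,), eps=1e-6, atol=1e-4))  # True if they match
```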

Thanks @albanD.
I created a pytorchviz visualization of my net, but it has some functions like ‘SliceBackward’ etc. which I am not sure what they do.
Is the list of functions shown by pytorchviz the same as the list of grad_fns in PyTorch?
Also, is the list of all grad_fns in PyTorch available somewhere?
Thanks

Is the list of functions shown by pytorchviz the same as the list of grad_fns in PyTorch?

Yes, it is just a visualization of the grad_fn graph in PyTorch 🙂 You can check the code, it just traverses the grad_fn and next_functions.
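For example, a small sketch that walks the graph by hand, in the same spirit as pytorchviz (the toy model and expected output are just illustrative):

```python
import torch

x = torch.randn(2, 3, requires_grad=True)
y = (x * 2).sum()

# Each node in the backward graph is a grad_fn; next_functions lists the
# grad_fns of its inputs, so we can traverse the graph recursively.
def walk(fn, depth=0):
    if fn is None:
        return
    print("  " * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        walk(next_fn, depth + 1)

walk(y.grad_fn)
# Expected to print something like:
# SumBackward0
#   MulBackward0
#     AccumulateGrad
```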

Okay, thanks for the clarification.
I have some grad_fns like ‘SliceBackward’ in my pytorchviz graph, and I am not sure how they work. Where can I find what such a function does, or do I need to work it out from the corresponding operation in the forward pass?
Thanks

The specification for what the backward does is here. And the functions it uses can be found either here or with the other ATen functions.
For slice in particular, it is here.
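For intuition, the backward of a slice simply places the incoming gradient back into the sliced positions of a zero tensor with the input's shape. A quick illustration (this is only a sketch of the behaviour, not the internal implementation):

```python
import torch

x = torch.randn(5, requires_grad=True)
y = x[1:4]            # slicing creates a SliceBackward node
y.sum().backward()

# The gradient is the incoming gradient scattered back into the sliced
# positions, and zero everywhere else.
print(x.grad)         # tensor([0., 1., 1., 1., 0.])
```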
