How to access return values of backward for debugging comparison?

I’m writing a custom convolution autograd function and I want to compare the results of backpropagation to the official convolution implementation. How can I call backward for torch.nn.functional.conv2d and store the output?

For example, I’d like to compare the weight gradient, the input gradient, and the bias gradient computed by conv2d backward to my computation of the same values.

What I’ve tried

import torch

# Toy inputs, targets, and weights
input = torch.randn((5, 3, 7, 15))
weight = torch.randn((2, 3, 3, 3), requires_grad=True)
target = torch.randint(0, 10, (5,))

# These two layers are so that I can have a loss gradient for backpropagation
linear = torch.nn.Linear(2*5*13, 10)  # Get things into the right shape for the loss function
loss = torch.nn.CrossEntropyLoss()

# Forwards pass
x_conv = torch.nn.functional.conv2d(input, weight)  # This is the convolution that I want to know the gradients of 
x_linear = linear(x_conv.view(-1, 2*5*13)) # The following two layers are simply for getting a loss 
x_loss = loss(x_linear, target)

# Backward pass
x_grad = x_loss.backward() # I thought this would call the backward methods of loss, linear, and conv2d and return the input, weight, and bias gradients, but x_grad is None

Have you already tried the .parameters() method? Sorry, but your post is a bit confusing.

Let me try to explain again.

I have a class that I wrote which performs convolution. Let’s call it custom_conv. It is an autograd.Function, so it has a forward and a backward method. In the backward method, it computes the weight gradient, bias gradient, and input gradient, and returns those values. Currently custom_conv is just performing standard convolution with nothing new added.
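For reference, it looks roughly like this (a simplified sketch, not my exact code; the gradient computations here just delegate to the torch.nn.grad helpers for illustration):

import torch

class CustomConv(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, weight):
        # Save the tensors needed to compute gradients in backward
        ctx.save_for_backward(input, weight)
        return torch.nn.functional.conv2d(input, weight)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        # Gradients w.r.t. forward's arguments, returned in the same order
        grad_input = torch.nn.grad.conv2d_input(input.shape, weight, grad_output)
        grad_weight = torch.nn.grad.conv2d_weight(input, weight.shape, grad_output)
        return grad_input, grad_weight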

I want to validate that custom_conv is working correctly by comparing against the gradients computed by PyTorch's conv2d function. Since these should be the return values of conv2d's backward method, I'm hoping I can access them somehow.

In the code snippet above, I simply wanted to show that calling .backward() does not return these values. I've added a few more comments that hopefully make that clearer. Is there an alternative way, perhaps with a hook, to access these values?
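Something like the following is the kind of approach I have in mind (a rough sketch continuing the snippet above; grads is just a dict for storage):

grads = {}
weight.register_hook(lambda g: grads.update(weight=g))  # called during backward with dL/dweight

x_conv = torch.nn.functional.conv2d(input, weight)
x_linear = linear(x_conv.view(-1, 2*5*13))
x_loss = loss(x_linear, target)
x_loss.backward()

# grads['weight'] now holds the gradient that conv2d's backward produced for weight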

You suggest using .parameters(). What should I call this on? torch.nn.functional.conv2d does not have a parameters method. Neither does the tensor it returns. Are you suggesting that I should be using a convolutional layer instead? Like torch.nn.Conv2d?

I found the parameters you meant. You mean that the weight and bias are Parameter instances and that they hold their gradients in their .grad attribute. I was able to use those variables to compare, and my method is working correctly :slight_smile:

Thank you.

For any future readers who are looking for the same thing: after executing the above code snippet, you can access the weight gradient as

weight.grad
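and if you want the gradient as a return value rather than through .grad, torch.autograd.grad gives it to you directly. A sketch that recomputes the same forward pass and checks that both routes agree:

x_conv = torch.nn.functional.conv2d(input, weight)
x_linear = linear(x_conv.view(-1, 2*5*13))
x_loss = loss(x_linear, target)

# Returns the gradient directly instead of accumulating it into weight.grad
(grad_weight,) = torch.autograd.grad(x_loss, weight)

print(torch.allclose(weight.grad, grad_weight))  # True: both routes agree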

Just one more question: input.grad doesn't exist. Is there a way to access the input gradient?

Try setting requires_grad to True for your input before running x_loss.backward().
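With the snippet above, that would look like this (a sketch; the forward pass is repeated so there is a fresh graph to backpropagate through):

input = torch.randn((5, 3, 7, 15), requires_grad=True)  # a leaf tensor that now collects gradients

x_conv = torch.nn.functional.conv2d(input, weight)
x_linear = linear(x_conv.view(-1, 2*5*13))
x_loss = loss(x_linear, target)
x_loss.backward()

print(input.grad.shape)  # torch.Size([5, 3, 7, 15]), the input gradient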
