How to find the source code of the conv2d backward function

I want to customize a conv2d layer, so I need to change the code of the forward and backward functions of this layer,
but I can't find where the original backward function's source code for conv2d lives in PyTorch.

Hi, @Tony_Lee

I think common operations/layers/functions/normalizations are defined in pytorch/pytorch/aten.

How about these 2?


Oh, I see, thanks!
Is there any Python interface I can use to customize grad_weight and grad_input?

If you want to manipulate gradients directly, module.register_backward_hook (see the docs) is one alternative.
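For illustration, here is a minimal sketch of the hook approach. It uses register_full_backward_hook, the non-deprecated successor of register_backward_hook in recent PyTorch releases; scale_grad_input is a hypothetical hook name, and halving the gradient is just an example of a manipulation.

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)

# Hypothetical hook: returning a new grad_input tuple replaces the
# gradient that flows back to the layer's input.
def scale_grad_input(module, grad_input, grad_output):
    return tuple(g * 0.5 if g is not None else None for g in grad_input)

conv.register_full_backward_hook(scale_grad_input)

x = torch.randn(1, 3, 8, 8, requires_grad=True)
conv(x).sum().backward()
print(x.grad.shape)  # the gradient reaching x has been halved
```

Note that a hook can only rewrite gradients that autograd has already computed; it does not let you change how they are computed.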

Another alternative is to override nn.Conv2d.
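A minimal sketch of that second approach, assuming you only need to transform the weight before the convolution runs (ScaledConv2d and the doubling are illustrative; autograd differentiates through the transform automatically):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledConv2d(nn.Conv2d):
    """Illustrative subclass: applies a transform to the weight on every
    forward pass, then reuses the stock functional conv2d."""
    def forward(self, input):
        w = self.weight * 2.0  # any differentiable transform of the weight
        return F.conv2d(input, w, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

layer = ScaledConv2d(3, 8, kernel_size=3, padding=1)
x = torch.randn(1, 3, 8, 8)
out = layer(x)
print(out.shape)  # torch.Size([1, 8, 8, 8])
```

This keeps autograd's own conv2d backward, so you never have to write grad_input/grad_weight by hand.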

Yes, I have tried to define a conv2d layer, but it doesn't work.
If you have done this, could you share your demo code with me?
My code is like this, but it doesn't work:

class Conv2dF(Function):

    @staticmethod
    def forward(ctx, input, weight, bias=None, stride=1, padding=0, dilation=1, groups=1):
        ctx.save_for_backward(input, weight, bias)
        ctx.stride, ctx.padding = stride, padding
        ctx.dilation, ctx.groups = dilation, groups
        return F.conv2d(input, weight, bias, stride, padding, dilation, groups)

    @staticmethod
    def backward(ctx, grad_output):
        # saved_variables is deprecated; use saved_tensors
        input, weight, bias = ctx.saved_tensors
        grad_input = grad_weight = grad_bias = None

        if ctx.needs_input_grad[0]:
            grad_input = torch.nn.grad.conv2d_input(
                input.shape, weight, grad_output,
                ctx.stride, ctx.padding, ctx.dilation, ctx.groups)
        if ctx.needs_input_grad[1]:
            grad_weight = torch.nn.grad.conv2d_weight(
                input, weight.shape, grad_output,
                ctx.stride, ctx.padding, ctx.dilation, ctx.groups)
        if bias is not None and ctx.needs_input_grad[2]:
            # one entry per output channel
            grad_bias = grad_output.sum(dim=(0, 2, 3))

        # backward must return one value per forward argument; the
        # non-tensor arguments (stride, padding, dilation, groups) get None
        return grad_input, grad_weight, grad_bias, None, None, None, None
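A handy way to sanity-check a hand-written conv backward is torch.autograd.gradcheck, which compares the analytical gradients against numerical ones (double precision is required). A minimal, self-contained sketch, bias-free and with default stride/padding; MyConv2d is just an illustrative name:

```python
import torch
from torch.autograd import gradcheck

class MyConv2d(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input, weight):
        ctx.save_for_backward(input, weight)
        return torch.nn.functional.conv2d(input, weight)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        # torch.nn.grad provides the reference conv gradients
        grad_input = torch.nn.grad.conv2d_input(input.shape, weight, grad_output)
        grad_weight = torch.nn.grad.conv2d_weight(input, weight.shape, grad_output)
        return grad_input, grad_weight

x = torch.randn(1, 2, 5, 5, dtype=torch.double, requires_grad=True)
w = torch.randn(3, 2, 3, 3, dtype=torch.double, requires_grad=True)
ok = gradcheck(MyConv2d.apply, (x, w))
print(ok)  # True when the hand-written gradients match
```

If gradcheck fails, the error message tells you which input's gradient disagrees, which usually points straight at the buggy branch of backward.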

Sorry, but I've never tried that myself… :pensive:

Anyway, thank you so much!

Hey Tony, I want to do the same thing as you do, modifying the conv2d layer. Did you figure this out?

Yes, I have solved it. I only wanted to redefine the weight, so I change the value of the weight and pass it into the conv2d function.
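That approach can be sketched like this; the pruning mask is just an illustrative way of "changing the value of the weight" before handing it to the stock conv2d:

```python
import torch
import torch.nn.functional as F

weight = torch.randn(8, 3, 3, 3, requires_grad=True)
x = torch.randn(1, 3, 8, 8)

# Transform the weight (here: zero out small entries), then call the
# built-in conv2d; autograd differentiates through the transform.
masked_weight = weight * (weight.abs() > 0.1).float()
out = F.conv2d(x, masked_weight, padding=1)
out.sum().backward()
print(weight.grad.shape)  # torch.Size([8, 3, 3, 3])
```

No custom backward is needed because the original weight tensor stays in the autograd graph.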