Custom Convolution

I want to create a custom convolution operation, ideally by overriding torch.nn.functional.conv2d(). For example, I want to “add” the network weights instead of “multiplying” them.
I don’t want to write the forward propagation from scratch, because then I couldn’t use backward() and I’d have to compute the gradients and everything else myself.

You can write a new nn.Module like this

import torch
import torch.nn as nn

class CustomConv(nn.Module):
    def __init__(self):
        super().__init__()
        # initialize the module; the weights must be an nn.Parameter
        self.weights = nn.Parameter(torch.randn(3, 1))

    def forward(self, input):
        # custom convolution here, built from differentiable torch ops
        ...

and use it like this

conv = CustomConv()
features = conv(input)

It will work with .backward(), but it will behave like the module torch.nn.Conv2d rather than the functional torch.nn.functional.conv2d(), and you need to wrap the weights in nn.Parameter() so they are registered with the module and picked up by the optimizer.
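
As a sketch of the “additive” idea (the class name AdditiveConv2d, the flat weight layout, and the stride-1/no-padding assumption are just mine for illustration), you could build it on top of torch.nn.functional.unfold, which hands you the sliding windows so the only custom part is the reduction:

import torch
import torch.nn as nn
import torch.nn.functional as F

class AdditiveConv2d(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.kernel_size = kernel_size
        # one flat weight vector per output channel, covering a full window
        self.filters = nn.Parameter(
            torch.randn(out_channels, in_channels * kernel_size * kernel_size))

    def forward(self, input):
        n, c, h, w = input.shape
        # unfold extracts every kernel_size x kernel_size window: (N, C*k*k, L)
        patches = F.unfold(input, self.kernel_size)
        # "add" the weights to each window element instead of multiplying,
        # then sum over the window, like a convolution's reduction step
        out = (self.filters[None, :, :, None] + patches[:, None, :, :]).sum(dim=2)
        out_h = h - self.kernel_size + 1
        out_w = w - self.kernel_size + 1
        return out.view(n, -1, out_h, out_w)   # the activation map

Since everything in forward is ordinary differentiable tensor ops, autograd takes care of the backward pass for you.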

First of all, thanks for your response.
If I do it the way you described, where should I store all the filters and the activation maps so that they can be used by backward()?

you save the filters like this

class CustomConv(nn.Module):
    def __init__(self):
        super().__init__()
        self.filters = nn.Parameter(torch.randn(8, 1, 3, 3))  # e.g. 8 filters over 1 input channel, 3x3 each

so that the filter tensor is registered as a parameter of the module: autograd will compute gradients for it and the optimizer will update it.
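
To check that the registration worked (reusing the hypothetical AdditiveConv2d sketch from my previous post), the filter tensor shows up in .parameters(), which is exactly what an optimizer consumes:

conv = AdditiveConv2d(in_channels=1, out_channels=4, kernel_size=3)
print([p.shape for p in conv.parameters()])              # [torch.Size([4, 9])]
optimizer = torch.optim.SGD(conv.parameters(), lr=0.1)   # will update the filters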

And you should return the activation map in the forward method like this

    def forward(self, input):
        # custom convolutions here produce activation_map
        return activation_map
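
Putting it together with the additive sketch from before, calling .backward() on something computed from the activation map fills in .grad on the registered filters, with no manual gradient code needed:

x = torch.randn(2, 1, 8, 8)                      # batch of two 1-channel 8x8 inputs
conv = AdditiveConv2d(in_channels=1, out_channels=4, kernel_size=3)

activation_map = conv(x)                         # shape (2, 4, 6, 6)
activation_map.sum().backward()                  # autograd handles the backward pass
print(conv.filters.grad.shape)                   # torch.Size([4, 9])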