Shared weights in convolution filters

Hi,

I want to create a new module that implements a custom convolution layer which, given one input channel, produces two output channels using, for example, the following two filters:

Those two filters share the weights w0, …, w7, and the center weight is fixed to zero. Ideally, I want to apply both filters together in a single F.conv2d(inputs, filters) call.

Is this possible? If so, how can I share the weights between the filters and keep the center weight fixed at zero?

Thanks in advance!

Hi,

The simplest solution is to store only a 1D Tensor containing [w0, w1, …, w7] as the learnable parameter, and then reconstruct the filters in every forward pass before passing them to F.conv2d.
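A minimal sketch of that idea, assuming 3x3 filters with the eight shared weights laid out row by row around the zero center; since the original filter images are not reproduced here, the second filter is taken to be the transpose of the first purely as a placeholder:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedWeightConv(nn.Module):
    """Conv layer with 1 input channel and 2 output channels whose 3x3
    filters share the weights w0..w7 and have a fixed zero center."""

    def __init__(self):
        super().__init__()
        # The only learnable parameters: the eight shared weights.
        self.w = nn.Parameter(torch.randn(8))

    def forward(self, x):
        zero = self.w.new_zeros(1)
        # Filter 1: w0..w7 placed row by row around the fixed zero center.
        f1 = torch.cat([self.w[:4], zero, self.w[4:]]).view(1, 1, 3, 3)
        # Filter 2: shares the same weights; here (as a placeholder) it is
        # the transpose of filter 1 -- substitute your own arrangement.
        f2 = f1.transpose(2, 3)
        filters = torch.cat([f1, f2], dim=0)  # shape (2, 1, 3, 3)
        return F.conv2d(x, filters, padding=1)
```

Because the filters are built from self.w with differentiable ops, the gradients from both output channels accumulate into the shared parameter automatically, and the center entry stays zero since it is never a parameter.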

Ok, I understand.

But is there a way to construct this in the constructor rather than in the forward pass? If there are many input and output channels, this could be a lot of work on every forward pass.

Not really. The filters have to be rebuilt at every forward pass, just like any other operation you perform in your forward computation, so that autograd can track how they were constructed from the shared parameters.
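For many input and output channels the reconstruction can still be a handful of vectorized tensor ops rather than a Python loop, so the per-forward overhead stays small. A sketch of one possible layout, assuming each group of eight weights yields a pair of output filters (again with the transpose standing in for the real second arrangement):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PairedSharedConv(nn.Module):
    """Many-channel version: each 8-weight vector produces a pair of
    3x3 output filters with a fixed zero center, rebuilt with a few
    tensor ops per forward."""

    def __init__(self, in_channels, pairs):
        super().__init__()
        self.in_channels = in_channels
        self.pairs = pairs
        # One 8-weight vector per (pair, input channel).
        self.w = nn.Parameter(torch.randn(pairs, in_channels, 8))

    def forward(self, x):
        zero = self.w.new_zeros(self.pairs, self.in_channels, 1)
        # Insert the fixed zero center and reshape to (pairs, in, 3, 3).
        f1 = torch.cat([self.w[..., :4], zero, self.w[..., 4:]], dim=-1)
        f1 = f1.view(self.pairs, self.in_channels, 3, 3)
        f2 = f1.transpose(2, 3)               # placeholder second arrangement
        filters = torch.cat([f1, f2], dim=0)  # (2 * pairs, in, 3, 3)
        return F.conv2d(x, filters, padding=1)
```

Here the whole filter bank is rebuilt with one cat, one view, one transpose, and one more cat, regardless of how many channels there are.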