Weight Sharing between Custom Convs

Hello,

For a project I am using the sparsity-invariant convolution, which is implemented as a class. The dilation rate is an argument to this class, so it has to be set at initialization.
Within a model block, I want to use this convolution 3 times in parallel with differing dilation rates. Is it possible to lock the weights between these 3 convolutions so that they share the same weights?

This weight sharing would be similar to DetectoRS (https://arxiv.org/abs/2006.02334).

Thank you in advance for your help.

Yes, weight sharing is possible. A simple approach would be to use the functional API: define the filters and bias once as nn.Parameters and apply them via F.conv2d with the different dilation rates.
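
A minimal sketch of that idea (the class name, kernel size, channel arguments, and dilation rates below are just placeholders, not your actual sparsity-invariant convolution):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedDilatedBlock(nn.Module):
    """Applies one shared set of 3x3 filters at several dilation rates."""
    def __init__(self, in_channels, out_channels, dilations=(1, 2, 3)):
        super().__init__()
        self.dilations = dilations
        # Register the shared filters and bias once as parameters
        self.weight = nn.Parameter(torch.empty(out_channels, in_channels, 3, 3))
        self.bias = nn.Parameter(torch.zeros(out_channels))
        # Roughly mimics nn.Conv2d's default initialization
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)

    def forward(self, x):
        outs = []
        for d in self.dilations:
            # padding = dilation keeps the spatial size for a 3x3 kernel
            outs.append(F.conv2d(x, self.weight, self.bias,
                                 stride=1, padding=d, dilation=d))
        return outs
```

Since all three F.conv2d calls use the same self.weight and self.bias tensors, the gradients from every branch accumulate into these single parameters during the backward pass.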

Thank you for your answer. Would this be a correct implementation of shared weights with a common gradient?

Convolution in __init__:

self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                              stride=stride, padding=padding,
                              groups=groups, dilation=dilation, bias=False)

Shared convolution in forward:

shared_weight = nn.Parameter(self.conv.weight)
shared_bias = self.conv.bias
x_d = F.conv2d(x_d, shared_weight, bias=shared_bias, stride=self.conv.stride,
               padding=self.switch_dilation, dilation=self.switch_dilation)

Don’t re-wrap the parameters in a new nn.Parameter, since that creates a new, independent parameter and the gradients would no longer be shared. Use them directly via F.conv2d(x_d, self.conv.weight, bias=self.conv.bias, ...).
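
Applied to your snippet, the forward pass could then look roughly like this (reusing the x_d, self.conv, and self.switch_dilation names from your post):

```python
# Use the registered parameters of self.conv directly; re-wrapping them in a new
# nn.Parameter would create an independent copy that no longer shares gradients.
x_d = F.conv2d(x_d, self.conv.weight, bias=self.conv.bias,
               stride=self.conv.stride,
               padding=self.switch_dilation, dilation=self.switch_dilation)
```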

Thanks a lot :slight_smile: