Weight Sharing between Custom Convs

Thank you for your answer. Would this be a correct implementation of shared weights with a common gradient?

Convolution defined in init:

# bias=False, so self.conv.bias is None
self.conv = nn.Conv2d(in_channels, out_channels, kernel_size,
                      stride=stride, padding=padding,
                      groups=groups, dilation=dilation, bias=False)

The weight is then shared in forward:

# Use self.conv.weight directly: wrapping it in a fresh nn.Parameter here
# would create a new leaf tensor and cut the gradient link to self.conv.weight.
shared_weight = self.conv.weight
shared_bias = self.conv.bias  # None, since the conv was built with bias=False
x_d = F.conv2d(x_d, shared_weight, bias=shared_bias,
               stride=self.conv.stride, groups=self.conv.groups,
               padding=self.switch_dilation,  # padding == dilation keeps size for a 3x3 kernel
               dilation=self.switch_dilation)
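
For reference, here is a minimal self-contained sketch of the whole pattern: a module that applies one kernel at several dilations and lets autograd accumulate all branch gradients into the single shared weight. The module name SwitchableDilationConv, the dilations argument, and the gradient check at the end are illustrative assumptions, not code from this thread.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SwitchableDilationConv(nn.Module):
    """Applies one shared 3x3 kernel at several dilations (hypothetical sketch)."""
    def __init__(self, in_channels, out_channels, dilations=(1, 2)):
        super().__init__()
        # Single parameter set; the nn.Conv2d only serves as its container.
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3,
                              padding=1, bias=False)
        self.dilations = dilations

    def forward(self, x):
        outs = []
        for d in self.dilations:
            # padding=d keeps the spatial size for a 3x3 kernel.
            # Referencing self.conv.weight directly (no re-wrapping) means
            # autograd accumulates every branch's gradient into one tensor.
            outs.append(F.conv2d(x, self.conv.weight, bias=None,
                                 stride=self.conv.stride,
                                 padding=d, dilation=d))
        return sum(outs)

# Quick check that the gradient is shared across both dilated calls:
m = SwitchableDilationConv(3, 8)
m(torch.randn(1, 3, 16, 16)).sum().backward()
print(m.conv.weight.grad.shape)  # one grad tensor, fed by both branches

After backward, m.conv.weight.grad holds the sum of the gradients from both dilated applications, which is exactly the "common gradient" behavior asked about above.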