Symmetric convolution weights

If you have prior knowledge that your convolution weights should be symmetrical along a given axis, how would you enforce that constraint?

I came up with a solution that avoids hard constraints by increasing the loss when the weights are not symmetrical.

import torch

def sym_loss(w, alpha=0.1, dims=(0,)):
    # L2 distance between the kernel and its mirrored copy, scaled by alpha
    nb_w = w.shape[0] * w.shape[1]
    return alpha * torch.pow(w - w.flip(dims=dims), 2).sum().sqrt() / nb_w

Here, w corresponds to the weights of the convolutional layer of interest. The default dims enforces a horizontal symmetry. The coefficient alpha controls the weight of this term in the overall loss.
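As a rough illustration, here is one way this penalty could be added to a training step. The layer, the task loss, and the dims=(2,) choice are assumptions made for the example; for a 4D [out_ch, in_ch, kH, kW] weight tensor, dims 2 and 3 are the spatial axes.

import torch.nn as nn

conv = nn.Conv2d(3, 16, kernel_size=5, padding=2)  # hypothetical layer
criterion = nn.CrossEntropyLoss()                  # hypothetical task loss

def total_loss(output, target):
    # Add the symmetry penalty on the conv weights to the task loss;
    # dims=(2,) mirrors each kernel along its height axis here.
    return criterion(output, target) + sym_loss(conv.weight, dims=(2,))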

If you have any feedback on this approach, please comment.

Personally, I see one drawback: you still have to search for and update weights on both sides of the symmetry axis, whereas a hard constraint defined in advance would restrict the search space more efficiently and precisely. Here you have no guarantee of symmetry.

Hi,
I think you can achieve this by using a custom nn.Module.
You can write an nn.Module which has only half of the parameters for the convolution.
In the forward pass you just need to build the whole kernel from that reduced set of parameters and call the convolution functional.

This way you need no loss constraints.
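For illustration, a minimal sketch of that idea, assuming a left-right (width-axis) symmetry; the class name SymConv2d, the initialization scheme, and the absence of padding/stride options are all simplifications for the example:

import torch
import torch.nn as nn
import torch.nn.functional as F

class SymConv2d(nn.Module):
    # Sketch: a 2D convolution whose kernels are mirror-symmetric along the
    # width axis; only the left half of each kernel is stored as parameters.
    def __init__(self, in_ch, out_ch, kernel_size):
        super().__init__()
        kh, kw = kernel_size
        half = (kw + 1) // 2  # left half, including the middle column when kw is odd
        self.kw = kw
        self.half_weight = nn.Parameter(torch.randn(out_ch, in_ch, kh, half) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        left = self.half_weight
        # Mirror the stored columns (excluding the middle one for odd widths)
        right = left[..., : self.kw // 2].flip(-1)
        weight = torch.cat([left, right], dim=-1)  # full symmetric kernel
        return F.conv2d(x, weight, self.bias)

With this parameterization the symmetry holds exactly at every step, so no extra loss term is needed.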


Thank you for your feedback. I'll try that and post the final module I come up with here. In the meantime, you get the solution reward :slight_smile: