Skip weights in convolution

Hey,

I am sorry if I missed a similar topic.
I want to skip a weight in a convolution layer. (I know, bad description :wink: ).

Let's say I have a kernel like [a, b, c].
But I want a == c (or more precisely, I want the optimizer to treat them as one parameter).
Is there a proper way to achieve this?
I am currently just setting them to (a + c) / 2 after every step, but this is not very satisfying.
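
For reference, here is roughly what I am doing now (just a minimal sketch; the layer and shapes are only for illustration):

import torch

# illustrative Conv1d with a single 3-tap kernel
conv = torch.nn.Conv1d(1, 1, kernel_size=3)

# after every optimizer step, force the first and last taps to agree
with torch.no_grad():
    avg = (conv.weight[..., 0] + conv.weight[..., 2]) / 2
    conv.weight[..., 0] = avg
    conv.weight[..., 2] = avg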

Sorry if I am just being a little bit stupid.
And thank you very much for your time
NPC

Hi NPC!

To me the conceptually cleanest approach would be to build your
constrained kernel from a separate two-element Parameter that
you actually optimize. Something like:

import torch
from itertools import chain

# initialize  a = 1.1, b = 2.2
k = torch.nn.Parameter (torch.tensor ([1.1, 2.2]))
opt = torch.optim.SGD (chain (my_model.parameters(), (k, )), lr = 0.1)
# ...
# then in your forward pass
    # x = apply_some_model_layers (x)
    # build the tied kernel [a, b, a] and give it the
    # (out_channels, in_channels, kernel_size) shape conv1d expects
    ker = torch.cat ((k, k[0:1])).reshape (1, 1, 3)
    x = torch.nn.functional.conv1d (x, ker)
    # x = apply_some_more_model_layers (x)
    return x

I believe that the weight of a torch.nn.Conv1d has to
be a leaf tensor, so we have to use the functional form,
torch.nn.functional.conv1d (). By using differentiable
pytorch tensor operations to build ker from k, you will be able
to backpropagate through the construction of ker and properly
optimize the two elements of k.
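
For what it's worth, here is a quick (purely illustrative) check with a random
input that the gradient does flow back into both elements of k:

x = torch.randn (1, 1, 8)                          # (batch, channels, length)
ker = torch.cat ((k, k[0:1])).reshape (1, 1, 3)    # tied kernel [a, b, a]
y = torch.nn.functional.conv1d (x, ker)
y.sum().backward()
print (k.grad)   # two gradients; k[0] accumulates contributions from both end taps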

Best.

K. Frank


Hello KFrank,

Thank you very much. I will try it this way :slight_smile:

Best.
NPC