Implementing Customized Convolution Kernels

Hi,

I’m going to implement a convolution layer with custom kernels.
For each spatial location in the tensor, I want to do the convolution with a different kernel.
For example:
for input[:,:,4,5], do the convolution with kernel1,
for input[:,:,4,6], do the convolution with kernel2.

How can I implement it?
Thanks!

Regards


You can:

  1. Create a convolutional layer
    conv = nn.Conv2d(in_channels, out_channels, kernel_size, ...)
  2. Copy your kernel weights to it
    conv.weight.copy_(myweights)

Just be aware that you will need to wrap the copy in torch.no_grad(), and freeze the weights if you don’t want that convolution’s weights to be updated while training.
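Putting those steps together, here is a minimal sketch (the channel counts, kernel size, and the myweights tensor are made up for illustration, not taken from your setup):

import torch
import torch.nn as nn

# illustrative sizes and custom weights
in_channels, out_channels, kernel_size = 3, 8, 3
myweights = torch.randn(out_channels, in_channels, kernel_size, kernel_size)

conv = nn.Conv2d(in_channels, out_channels, kernel_size, padding=1, bias=False)

# copy_ has to run under no_grad, because conv.weight is a leaf tensor that requires grad
with torch.no_grad():
    conv.weight.copy_(myweights)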

Thanks! @imaluengo I will try it.

Also, to avoid gradients for those weights, this might help:

for p in conv.parameters():
    p.requires_grad = False
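Note that a single nn.Conv2d with one copied weight still applies the same kernel everywhere, so it does not directly cover the original per-location question. One possible way to use a different kernel at each spatial location (not from this thread, just a rough sketch with F.unfold; all sizes and names are illustrative) is to extract patches and contract each patch with its own kernel:

import torch
import torch.nn.functional as F

N, C_in, H, W = 1, 3, 5, 5
C_out, k = 4, 3
x = torch.randn(N, C_in, H, W)

# one flattened kernel per output location: (H*W, C_out, C_in*k*k)
weights = torch.randn(H * W, C_out, C_in * k * k)

# extract k x k patches around every location: (N, C_in*k*k, H*W)
patches = F.unfold(x, kernel_size=k, padding=k // 2)

# contract each patch with the kernel assigned to its location
out = torch.einsum('nkl,lok->nol', patches, weights)  # (N, C_out, H*W)
out = out.view(N, C_out, H, W)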

Have you solved this?