Share Kernel Between Dilated Conv2d Layers

I would like to perform 2D convolution on the same data using the same kernel but with different dilation rates. How do I do this kernel sharing?

Would it be best to initialize a single kernel and then reuse it across multiple conv2d calls via the functional API?

Or should I create multiple layers, each with its own kernel, and then tie their weights to each other?
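The functional approach is usually the simplest way to share one weight tensor across several dilation rates: hold a single `nn.Parameter` and pass it to `F.conv2d` once per dilation. A minimal sketch (the module name, channel arguments, and dilation list are my own choices, not from the thread):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedDilatedConv(nn.Module):
    """Apply one shared kernel at several dilation rates."""

    def __init__(self, in_ch, out_ch, kernel_size, dilations=(1, 2, 4)):
        super().__init__()
        self.dilations = dilations
        # Single weight/bias shared by every dilation branch.
        self.weight = nn.Parameter(
            torch.empty(out_ch, in_ch, kernel_size, kernel_size))
        nn.init.kaiming_uniform_(self.weight, a=5 ** 0.5)
        self.bias = nn.Parameter(torch.zeros(out_ch))

    def forward(self, x):
        outs = []
        for d in self.dilations:
            # "same" padding for odd kernel sizes: d * (k - 1) // 2
            pad = d * (self.weight.shape[-1] - 1) // 2
            outs.append(F.conv2d(x, self.weight, self.bias,
                                 padding=pad, dilation=d))
        return outs
```

Because every branch reads the same `self.weight`, gradients from all dilation rates accumulate into that one tensor during backprop, so the kernel stays shared automatically; there is no need to copy weights between separate layers after each update.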


Hi, I have the same question. Did you solve the problem?