I have a tensor of size [Batch, Channels, H, W]
I want to manually apply a single 5x5 filter on every channel for every batch equally. How exactly do I do this?
I tried the following, but it doesn't work:
import numpy as np
import torch
import torch.nn.functional as F

N = 5
kernel = torch.Tensor(np.ones((N, N)).astype(np.float32))  # example 5x5 kernel
kernel = kernel.unsqueeze(0).unsqueeze(0)    # shape [1, 1, N, N]
kernel = kernel.repeat((1, Channels, 1, 1))  # shape [1, Channels, N, N]
kwargs = {'weight': kernel, 'padding': N // 2}
tensor_conv = F.conv2d(tensor, **kwargs)
I get:
tensor.shape      = torch.Size([2, 96, 64, 64])
tensor_conv.shape = torch.Size([2, 1, 64, 64])
Why does the output now have only one channel? I want the same 5x5 convolution applied to every channel.
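For reference, one way to get this per-channel behaviour is a depthwise convolution via `F.conv2d`'s `groups` argument: with `groups=C`, each input channel is convolved independently, so repeating one kernel C times along the first weight dimension applies the same filter to every channel. A minimal sketch using the shapes from the example above:

```python
import torch
import torch.nn.functional as F

B, C, H, W = 2, 96, 64, 64  # example shapes from the question
N = 5

x = torch.randn(B, C, H, W)

# For a grouped conv the weight shape is [out_channels, in_channels // groups, N, N].
# With groups=C, each channel gets its own [1, N, N] filter, so repeating a
# single kernel C times along dim 0 applies the same 5x5 filter per channel.
kernel = torch.ones(1, 1, N, N)
weight = kernel.repeat(C, 1, 1, 1)  # shape [C, 1, N, N]

out = F.conv2d(x, weight, padding=N // 2, groups=C)
print(out.shape)  # torch.Size([2, 96, 64, 64])
```

The earlier weight of shape [1, Channels, N, N] instead defines a single output filter that sums over all input channels, which is why the result had one channel.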