Hi. I have a tensor made up of 8 matrices, each of size 20x20. I am trying to run a convolution over this tensor, but I don't understand the output dimensions. Here is what I have so far (x is my tensor):
import torch
import torch.nn as nn
import torch.nn.functional as F

class CNN(nn.Module):
    def __init__(self, n_filters, filter_sizes):
        super(CNN, self).__init__()
        self.conv0 = nn.Conv2d(in_channels=1,
                               out_channels=n_filters[0],
                               kernel_size=filter_sizes[0])

    def forward(self, x, lengths=None):
        x = x.unsqueeze(1)                 # (8, 20, 20) -> (8, 1, 20, 20)
        map0_conv = F.relu(self.conv0(x))
        map0_avg = torch.mean(map0_conv, dim=1)
My filter size is 2x2 and the stride is 1, so each 20x20 matrix yields a 19x19 output. I reasoned that I therefore need 19*19 = 361 filters, so I set n_filters[0] = 361. Starting from an 8x20x20 tensor, the output of self.conv0(x) then has dimensions 8x361x19x19.
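To double-check, here is a minimal standalone sketch (random data, with the kernel size and filter count described above) that reproduces the shape I am seeing:

```python
import torch
import torch.nn as nn

x = torch.randn(8, 20, 20)   # 8 matrices of 20x20
x = x.unsqueeze(1)           # -> (8, 1, 20, 20): batch of 8, one channel each
conv = nn.Conv2d(in_channels=1, out_channels=361, kernel_size=2)  # stride defaults to 1
out = conv(x)
print(out.shape)             # torch.Size([8, 361, 19, 19])
```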
I don't know what this result means, and it is causing problems in the next step, the torch.mean. What I am trying to do is take the 8x20x20 tensor, apply a convolution to it, and then average each convolved matrix, so that I end up with a single vector of 8 entries. Clearly I am doing something wrong, but I don't know how to fix it.
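For concreteness, this toy sketch (random data, and a guess of out_channels=1 plus averaging over everything except the batch dimension) shows the shapes I am aiming for:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(8, 20, 20)                     # 8 matrices, 20x20 each
conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=2)  # guess: one filter
out = F.relu(conv(x.unsqueeze(1)))             # -> (8, 1, 19, 19)
vec = out.mean(dim=(1, 2, 3))                  # average each convolved map
print(vec.shape)                               # torch.Size([8]): one entry per matrix
```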