Using ConvTranspose2d and Conv2d on the same input

Hello,

I am currently working on HoughNet (https://arxiv.org/abs/2007.02355) paper which has its code publicly available. I tried to change ConvTranspose2d in (https://github.com/nerminsamet/houghnet/blob/master/src/lib/models/networks/houghnet_large_hourglass.py#L280) with Conv2d. However, I get this error message:


RuntimeError: Given groups=1, weight of size [9, 1, 17, 17], expected input[2, 9, 128, 192] to have 1 channels, but got 9 channels instead

The original configuration is as follows:

Sequential(
  (0): ConvTranspose2d(9, 1, kernel_size=(17, 17), stride=(1, 1), padding=(8, 8), bias=False)
) 
input_size = torch.Size([2, 9, 1, 128, 192])
weight_size = torch.Size([9, 1, 17, 17])

The error occurs when the configuration is as follows:

Sequential(
  (0): Conv2d(9, 1, kernel_size=(17, 17), stride=(1, 1), padding=(8, 8), bias=False)
) 
input_size = torch.Size([2, 9, 1, 128, 192])
weight_size = torch.Size([9, 1, 17, 17])

As far as I understand, Conv2d should be applicable to this input. I tried to replicate this in a small trial script, but there I had to use a 4D input and I wasn't able to reproduce the error: whenever I have 4D inputs, I can replace my conv layers with transposed convs and vice versa. Any idea why this might occur?

Thanks.

The weight_size of the nn.Conv2d module is wrong, as it should be:

import torch.nn as nn

conv = nn.Conv2d(9, 1, kernel_size=(17, 17), stride=(1, 1), padding=(8, 8), bias=False)
print(conv.weight.shape)
> torch.Size([1, 9, 17, 17])
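For comparison, nn.ConvTranspose2d stores its weight as [in_channels, out_channels, kH, kW], so the same constructor arguments produce exactly the shape you posted:

```python
import torch.nn as nn

# Same arguments as in your config, but for the transposed conv:
deconv = nn.ConvTranspose2d(9, 1, kernel_size=(17, 17), stride=(1, 1), padding=(8, 8), bias=False)

# ConvTranspose2d weight layout is [in_channels, out_channels, kH, kW]
print(deconv.weight.shape)
# torch.Size([9, 1, 17, 17])
```

That's why a weight created for the transposed conv no longer matches once the module is swapped to nn.Conv2d, whose weight layout is [out_channels, in_channels, kH, kW].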

Are you changing the .weight attribute manually in your code?

I simply changed the ConvTranspose2d with Conv2d and I get this error. However, the weight initially is assigned manually: https://github.com/nerminsamet/houghnet/blob/master/src/lib/models/networks/houghnet_large_hourglass.py#L276

I would like to reduce 9 channels to 1, do I have to have the matrix ordering swapped manually as well?

Yes. As shown in my minimal code snippet, the weight shape in your code doesn't match what nn.Conv2d expects, so you would have to permute it to the expected layout, i.e. swap dim0 with dim1.
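A minimal sketch of that fix, using a random tensor to stand in for the manually created HoughNet weight (the actual initialization lives in the repo you linked):

```python
import torch
import torch.nn as nn

# Stand-in for the manually created weight, which is laid out for
# ConvTranspose2d: [in_channels, out_channels, kH, kW] = [9, 1, 17, 17]
weight = torch.randn(9, 1, 17, 17)

conv = nn.Conv2d(9, 1, kernel_size=(17, 17), stride=(1, 1), padding=(8, 8), bias=False)
with torch.no_grad():
    # Conv2d expects [out_channels, in_channels, kH, kW] = [1, 9, 17, 17],
    # so swap dim0 and dim1 before copying the values in
    conv.weight.copy_(weight.permute(1, 0, 2, 3))

x = torch.randn(2, 9, 128, 192)
out = conv(x)
print(out.shape)
# torch.Size([2, 1, 128, 192])
```

With kernel 17, padding 8 and stride 1, the spatial size is preserved, so the layer just reduces the 9 channels to 1 as you intended.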
