FLOPs in Conv2d and ConvTranspose2d

Hi,

After calculating the FLOPs of my model (a GAN), I found something strange.

I compared the following two cases:

1. Conv2d(kernel_size=3, stride=1, padding=1)
2. ConvTranspose2d(kernel_size=4, stride=2, padding=1)

Both layers have 1024 input channels and 512 output channels.
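
In PyTorch terms, the setup looks like this (a minimal sketch of the two layers above):

    import torch.nn as nn

    # The two layers under comparison (1024 -> 512 channels)
    conv = nn.Conv2d(1024, 512, kernel_size=3, stride=1, padding=1)
    deconv = nn.ConvTranspose2d(1024, 512, kernel_size=4, stride=2, padding=1)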

And my computation results are:

Conv2d
    - # of parameters: 4.719 M
    - GMac: 3.7

ConvTranspose2d
    - # of parameters: 8.389 M
    - GMac: 1.645

First, are the above results valid?
If so, why is the FLOP count of Conv2d larger than that of ConvTranspose2d, even though Conv2d has fewer parameters?

The number of parameters seems to be correct, and you can verify it via:

    nb_params = conv.weight.nelement()
    # You can skip the bias (conv.bias.nelement()), as it is insignificant compared to the weights.
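
For example, applying this to the two layer definitions from your post reproduces your numbers exactly:

    print(conv.weight.nelement())    # 4718592 -> ~4.719 M (= 1024 * 512 * 3 * 3)
    print(deconv.weight.nelement())  # 8388608 -> ~8.389 M (= 1024 * 512 * 4 * 4)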

Since your transposed convolution uses a stride of 2, its kernel is applied to fewer patches than the conv layer's. However, each kernel is larger (4x4 vs. 3x3), which increases the number of theoretical FLOPs again, just not enough to offset the smaller number of positions.
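
A back-of-the-envelope check makes this concrete. Most FLOP counters estimate a convolution's MACs as roughly weight parameters * number of spatial positions the kernel is applied at; the position counts below are inferred from your reported numbers, not known from your model:

    conv_weights = 1024 * 512 * 3 * 3    # 4,718,592
    deconv_weights = 1024 * 512 * 4 * 4  # 8,388,608

    # Positions implied by the reported GMac values (inferred, not measured)
    print(3.7e9 / conv_weights)      # ~784 = 28 * 28 positions
    print(1.645e9 / deconv_weights)  # ~196 = 14 * 14 positions

So the 3x3 kernel is applied at about 4x as many spatial positions as the 4x4 kernel, which more than offsets the transposed convolution's ~1.8x larger per-position cost.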
