Can someone explain this confusing error for me please?

I get the error

RuntimeError: Expected tensor for argument #1 'input' to have the same dimension as tensor for 'result'; but 4 does not equal 2 (while checking arguments for cudnn_convolution)

if I run the code below. A quick Google search turns up various issues that can cause this; I assume PyTorch doesn't like the 9 input channels and 64 output channels?

import torch
import torch.nn as nn

real = torch.randn([32, 9, 32, 32]).cuda()  # input on the GPU to match netd.cuda() below (the error mentions cudnn)

class options:
    def __init__(self):
        self.ndf = 64  # number of discriminator feature maps

opt = options()

netd = nn.Sequential(
    # 9 x 32 x 32 -> ndf x 16 x 16
    nn.Conv2d(9, opt.ndf, 4, 2, 1, bias=False),
    nn.LeakyReLU(0.2, inplace=True),

    # ndf x 16 x 16 -> (ndf*2) x 8 x 8
    nn.Conv2d(opt.ndf, opt.ndf * 2, 4, 2, 1, bias=False),
    nn.BatchNorm2d(opt.ndf * 2),
    nn.LeakyReLU(0.2, inplace=True),

    # (ndf*2) x 8 x 8 -> (ndf*4) x 4 x 4
    nn.Conv2d(opt.ndf * 2, opt.ndf * 4, 4, 2, 1, bias=False),
    nn.BatchNorm2d(opt.ndf * 4),
    nn.LeakyReLU(0.2, inplace=True),

    # (ndf*4) x 4 x 4 -> (ndf*8) x 2 x 2
    nn.Conv2d(opt.ndf * 4, opt.ndf * 8, 4, 2, 1, bias=False),
    nn.BatchNorm2d(opt.ndf * 8),
    nn.LeakyReLU(0.2, inplace=True),

    # (ndf*8) x 2 x 2 with a 4x4 kernel: the feature map is smaller than the kernel
    nn.Conv2d(opt.ndf * 8, 1, 4, 1, 0, bias=False),
)

netd.cuda()

test = netd(real)  # raises the RuntimeError above

Your input to the last conv layer is [batch_size, 512, 2, 2], which is too small for a kernel size of 4.
Change the last layer to nn.Conv2d(opt.ndf*8, 1, 2, 1, 0, bias=False) and your model should run.
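This is easy to check by tracing the spatial size through each conv with out = floor((in + 2*padding - kernel) / stride) + 1. A minimal sketch, using the same (kernel, stride, padding) values as the layers above:

import math

size = 32  # input is 32x32
# (kernel, stride, padding) for each Conv2d in netd
for k, s, p in [(4, 2, 1), (4, 2, 1), (4, 2, 1), (4, 2, 1), (4, 1, 0)]:
    size = math.floor((size + 2 * p - k) / s) + 1
    print(f"k={k}, s={s}, p={p} -> {size}x{size}")
# prints 16, 8, 4, 2, then -1: the 2x2 feature map is smaller than the 4x4 kernel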

Oops, thanks for the solution!

Edited: Now, during backprop of the output from that model, I get

Expected tensor for 'result' to have the same dimension as tensor for argument #1 'grad_output'; but 4 does not equal 1 (while checking arguments for cudnn_convolution_backward_input)

e.g.:

output = netd(input)
output.backward(one)

You can probably tell I am using the WGAN critic (netd) with no gradient penalty (GP) applied.
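Presumably `one` here is a 1-element tensor (as in the usual WGAN reference code), while the output of the modified model has shape [32, 1, 1, 1], so the shapes don't match. For a non-scalar output, backward() needs a gradient argument with the same shape as the output; a minimal sketch, assuming the netd above:

output = netd(real)
# the gradient must have the same shape as output, i.e. [32, 1, 1, 1]
output.backward(torch.ones_like(output))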

Edited: I solved this by calling output = output.mean() and backpropagating from the resulting scalar. I don't know if this is the best way, but I am getting generations.
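For reference, reducing the critic output with mean() is what the standard WGAN critic loss does anyway. A minimal sketch of the usual no-GP critic update, assuming the netd and real batch above; the fake batch and optimizer settings are placeholders, since the post doesn't show a generator:

import torch

# assumes netd from above, already on the GPU
fake = torch.randn(32, 9, 32, 32).cuda()  # stand-in for netg(noise).detach()

optimizer = torch.optim.RMSprop(netd.parameters(), lr=5e-5)  # lr from the original WGAN paper

optimizer.zero_grad()
# mean() reduces the [32, 1, 1, 1] critic output to a scalar, so backward() needs no gradient argument
loss_d = netd(fake).mean() - netd(real).mean()
loss_d.backward()
optimizer.step()

# the original WGAN (without gradient penalty) enforces the Lipschitz constraint by weight clipping
for p in netd.parameters():
    p.data.clamp_(-0.01, 0.01)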