Given groups=1, weight of size [64, 32, 4], expected input[1, 64, 128] to have 32 channels, but got 64 channels instead

I am trying to generate time series data using a convolutional GAN.
My model is below:
seq_size = batch_size
n_features = X2_scaled.shape[2]

Discriminator

D = nn.Sequential(
    nn.Conv1d(32, 64, kernel_size=4, stride=2, padding=2, dilation=1),  # 32
    nn.BatchNorm2d(64),
    nn.ReLU(),

    nn.Conv1d(64, 64, kernel_size=3, stride=2, padding=1, dilation=2),
    nn.BatchNorm2d(128),
    nn.ReLU(),

    nn.Conv1d(64, 32, kernel_size=1, stride=2, padding=0, dilation=4),
    # nn.BatchNorm2d(4),
    nn.Sigmoid()
)

Generator

G = nn.Sequential(
    nn.Linear(4, 7),  # as z is of shape (latent_dim, 4)

    nn.ConvTranspose1d(7, 64, kernel_size=3, stride=1, padding=1),
    nn.BatchNorm1d(64),
    nn.ReLU(),

    nn.ConvTranspose1d(64, 64, kernel_size=2, stride=1, padding=1),
    nn.BatchNorm1d(64),
    nn.ReLU(),

    nn.ConvTranspose1d(64, 32, kernel_size=1, stride=1, padding=1),
    # nn.BatchNorm1d(4),
    nn.Tanh(),
)

D = D.to(device)
G = G.to(device)

Training

total_step = len(data_loader)
for epoch in range(num_epochs):
    for i, samples in enumerate(data_loader):
        samples = samples.reshape(batch_size, -1).to(device)
        print(samples.shape)  # giving original input shape (568, 32, 4)
        outputs = D(samples.float())  # <- RuntimeError raised here

Here I am getting the runtime error: Given groups=1, weight of size [64, 32, 4], expected input[1, 64, 128] to have 32 channels, but got 64 channels instead.
I am unable to see my mistake here. Please help me. Thanks in advance!

The shape in this comment:

print(samples.shape)  # giving original input shape (568, 32, 4)

cannot be true, since you are reshaping the tensor to two dimensions only.

nn.Conv1d expects a batched 3D tensor in the shape [batch_size, channels, seq_len] or an unbatched 2D tensor in the shape [channels, seq_len]. In the latter case the layer will unsqueeze the missing batch dimension and treat the input as a single sample with a batch size of 1. That is what the error message is telling you: the first layer, nn.Conv1d(32, 64, kernel_size=4, ...), has a weight of shape [64, 32, 4] (out_channels, in_channels, kernel_size) and therefore expects 32 input channels, but your flattened 2D tensor of shape [64, 128] was interpreted as an unbatched input with 64 channels and unsqueezed to [1, 64, 128].
I would generally recommend unsqueezing the batch dimension explicitly, as I don’t think this “flexibility” of accepting inputs without a batch dimension is a well-designed feature; it mainly creates confusion.
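As a minimal illustration (the batch size of 8 is arbitrary; the 32 channels and sequence length of 4 just mirror your first conv layer and data shape):

import torch
import torch.nn as nn

conv = nn.Conv1d(in_channels=32, out_channels=64, kernel_size=4, stride=2, padding=2)

# batched 3D input: [batch_size, channels, seq_len]
x = torch.randn(8, 32, 4)
print(conv(x).shape)                      # torch.Size([8, 64, 3])

# a single sample: unsqueeze the batch dimension explicitly
x_single = torch.randn(32, 4)
print(conv(x_single.unsqueeze(0)).shape)  # torch.Size([1, 64, 3])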
After fixing this issue you would most likely run into a new one since you are using nn.BatchNorm2d while I assume it should be nn.BatchNorm1d.
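As a rough sketch, the discriminator could look like this after both fixes. I'm keeping your conv hyperparameters and assuming the norm layers are meant to match the preceding layer's out_channels (the 128 in BatchNorm2d(128) looks like a leftover):

D = nn.Sequential(
    nn.Conv1d(32, 64, kernel_size=4, stride=2, padding=2, dilation=1),
    nn.BatchNorm1d(64),
    nn.ReLU(),

    nn.Conv1d(64, 64, kernel_size=3, stride=2, padding=1, dilation=2),
    nn.BatchNorm1d(64),   # was nn.BatchNorm2d(128)
    nn.ReLU(),

    nn.Conv1d(64, 32, kernel_size=1, stride=2, padding=0, dilation=4),
    nn.Sigmoid()
)

# in the training loop, keep the 3D layout instead of flattening:
# samples = samples.to(device).float()   # [batch_size, 32, 4]
# outputs = D(samples)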

Yes, when I was not reshaping it, it was showing ValueError: expected 4D input (got 3D input).
Now I see why that was happening. I had made a serious mistake by using BatchNorm2d().
Thank you so much. I don’t get the same error anymore.
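For anyone else hitting the same ValueError: nn.BatchNorm2d expects a 4D [N, C, H, W] tensor, which is why it rejected the 3D output of nn.Conv1d, while nn.BatchNorm1d accepts [N, C, L]. A quick check (shapes made up for illustration):

import torch
import torch.nn as nn

y = torch.randn(8, 64, 3)            # typical Conv1d output: [N, C, L]
print(nn.BatchNorm1d(64)(y).shape)   # works: torch.Size([8, 64, 3])
# nn.BatchNorm2d(64)(y)              # ValueError: expected 4D input (got 3D input)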