ValueError: Expected input batch_size (1) to match target batch_size (40000)

I have training data of size torch.Size([3, 3, 200, 200]), and when I put it through my convolutional network, I get this error.
How can I fix it?

import torch
import torch.nn as nn

class ConNet(nn.Module):
    def __init__(self):
        super(ConNet, self).__init__()
        # Block 1: 3x200x200 -> 24x200x200 (padding keeps size) -> pooled to 24x100x100
        self.ConNet1 = nn.Sequential(
            nn.Conv2d(
                in_channels=3,
                out_channels=24,
                kernel_size=3,
                stride=1,
                padding=1,
            ),
            nn.ReLU(),
            nn.AvgPool2d(
                kernel_size=2,
                stride=2,
            ),
        )
        # Block 2: 24x100x100 -> 12x98x98 (no padding) -> pooled to 12x49x49
        self.ConNet2 = nn.Sequential(
            nn.Conv2d(24, 12, 3, 1, 0),
            nn.ReLU(),
            nn.AvgPool2d(2, 2),
        )
        # Classifier head: flattened 12*49*49 features -> 40000 outputs
        self.classifier = nn.Sequential(
            nn.Linear(12 * 49 * 49, 256),
            nn.ReLU(),
            nn.Linear(256, 128),
            nn.ReLU(),
            nn.Linear(128, 40000),
        )

    def forward(self, x):
        x = self.ConNet1(x)
        x = self.ConNet2(x)
        x = x.view(x.size(0), -1)  # flatten, keeping the batch dimension
        output = self.classifier(x)
        return output

connet = ConNet()
print(connet)
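
For reference, a dummy forward pass (a minimal check, not part of my training code) traces the shapes and shows where the 12 * 49 * 49 flatten size comes from:

x = torch.rand(3, 3, 200, 200)  # batch of 3 RGB 200x200 images
h1 = connet.ConNet1(x)
print(h1.shape)                 # torch.Size([3, 24, 100, 100])
h2 = connet.ConNet2(h1)
print(h2.shape)                 # torch.Size([3, 12, 49, 49])
print(connet(x).shape)          # torch.Size([3, 40000])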

This has nothing to do with your network. Your network is fine; I tested it by copying your code and putting torch.rand(3, 3, 200, 200) through it.
This error comes from your loss function. The output of your network seems to be a tensor of size (1, 40000) while the target seems to be of size (40000).
Since you said your input has batch size 3, none of that adds up.
Did you maybe use batch size 1 when this error occurred? If so, you can make it work by just unsqueezing the target, like this: target = torch.unsqueeze(target, 0) (see the sketch below).
Or does it also happen with a batch size higher than 1?
If so, can you post your code? Specifically the bit between the output of your net and where your loss function is called.
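
Here is a minimal sketch of that fix. Since the part of your code with the loss function isn't shown, nn.MSELoss and the target tensor here are just placeholder assumptions; swap in whatever you actually use:

import torch
import torch.nn as nn

connet = ConNet()
criterion = nn.MSELoss()  # assumed loss; the original loss isn't shown

x = torch.rand(1, 3, 200, 200)       # single sample: batch size 1
output = connet(x)                   # shape: (1, 40000)

target = torch.rand(40000)           # hypothetical target missing its batch dim
target = torch.unsqueeze(target, 0)  # now (1, 40000), matching the output

loss = criterion(output, target)
loss.backward()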

Thank you very much for answering my question. The error does come from batch size 1, and using unsqueeze solved it. The error doesn't come from my network; if you were confused by my network, I'm very sorry for that.
And thanks again.