I am getting this RuntimeError: size mismatch, m1: [100 x 6272], m2: [1568 x 10]

I flatten in the forward function, but I am still getting the same error. I followed a tutorial and I get the same parameter sizes as the author. I am working with the MNIST dataset.

class CNNModel(nn.Module):
    def __init__(self):
        super(CNNModel, self).__init__()
        # Conv 1
        self.cnn1 = nn.Conv2d(in_channels=1, out_channels=16, kernel_size=5, stride=1, padding=2)
        self.relu1 = nn.ReLU()
        # Maxpool 1
        self.maxpool1 = nn.MaxPool2d(kernel_size=2)
        # Conv 2
        self.cnn2 = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=5, stride=1, padding=2)
        self.relu2 = nn.ReLU()

        self.fc1 = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        # Conv 1
        out = self.cnn1(x)
        out = self.relu1(out)
        # Maxpool 1
        out = self.maxpool1(out)
        # Conv 2
        out = self.cnn2(out)
        out = self.relu2(out)
        out = out.view(out.size(0), -1)  # flatten

        out = self.fc1(out)
        return out

These are my parameter sizes:

torch.Size([16, 1, 5, 5])
torch.Size([16])
torch.Size([32, 16, 5, 5])
torch.Size([32])
torch.Size([10, 1568])

Can you please point out where I made the mistake?

You are only pooling once, so the activation fed into the linear layer has shape [batch_size, 32, 14, 14], i.e. 32*14*14 = 6272 features per sample.
Either add a second pooling layer after the second convolution, or change the number of input features of your linear layer to 6272.
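For reference, here is a minimal sketch of the first option: the same model as above with a `maxpool2` layer added after the second convolution, so the 28x28 input is halved twice (28 → 14 → 7) and the flattened size matches `fc1`'s 32*7*7 input features. The layer names mirror the code in the question; the shape check at the end is just for illustration.

```python
import torch
import torch.nn as nn


class CNNModel(nn.Module):
    def __init__(self):
        super(CNNModel, self).__init__()
        self.cnn1 = nn.Conv2d(in_channels=1, out_channels=16, kernel_size=5, stride=1, padding=2)
        self.relu1 = nn.ReLU()
        self.maxpool1 = nn.MaxPool2d(kernel_size=2)  # 28x28 -> 14x14
        self.cnn2 = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=5, stride=1, padding=2)
        self.relu2 = nn.ReLU()
        self.maxpool2 = nn.MaxPool2d(kernel_size=2)  # the missing layer: 14x14 -> 7x7
        self.fc1 = nn.Linear(32 * 7 * 7, 10)

    def forward(self, x):
        out = self.maxpool1(self.relu1(self.cnn1(x)))
        out = self.maxpool2(self.relu2(self.cnn2(out)))
        out = out.view(out.size(0), -1)  # flatten to [batch_size, 1568]
        return self.fc1(out)


model = CNNModel()
x = torch.randn(100, 1, 28, 28)  # a batch of MNIST-sized images
print(model(x).shape)  # torch.Size([100, 10])
```

With `padding=2` and `kernel_size=5`, each convolution preserves the spatial size, so only the two pooling layers shrink it; that is why the flattened size is exactly 32*7*7 = 1568.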

Thanks a lot. I missed the second max-pooling layer.