Batch Size issue in MLP

Is it necessary for the total number of files to be a multiple of batch_size in PyTorch? I have multiple folders containing NumPy files, and since the file count in each folder is not a multiple of 32, I am getting a size mismatch error while training the network. Are there any fixes, or tips on loading the data, so that the size mismatch does not occur?
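For reference, here is a minimal sketch of the kind of loading setup I mean — the folder names, the `*.npy` pattern, and the absence of labels are placeholders, not my exact code:

```python
import glob
import os

import numpy as np
import torch
from torch.utils.data import DataLoader, Dataset


class NumpyFolderDataset(Dataset):
    """One sample per .npy file, gathered from several folders."""

    def __init__(self, folders):
        self.files = []
        for folder in folders:
            self.files.extend(sorted(glob.glob(os.path.join(folder, "*.npy"))))

    def __len__(self):
        return len(self.files)

    def __getitem__(self, idx):
        return torch.from_numpy(np.load(self.files[idx])).float()


# If the total file count is not a multiple of 32, the final batch is smaller.
loader = DataLoader(NumpyFolderDataset(["folder_1", "folder_2"]), batch_size=32)
```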

@AshviniKSharma Could you please share a screenshot of the error, so we can get a clearer idea of the size mismatch?

I don't have a screenshot right now, since as a workaround I have manually adjusted every folder to contain a multiple of 32 files, but training fails whenever the total number of files is not a multiple of the batch size (32).

@AshviniKSharma Try increasing the number of files in each folder (it does not need to be a multiple of 32) and check whether it makes any difference.
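For what it's worth, the `DataLoader` itself does not require the dataset size to be a multiple of `batch_size`; by default it simply yields a smaller final batch. Here is a sketch of two common ways to handle that, assuming the mismatch comes from that smaller last batch (the `TensorDataset` below is just a stand-in for your NumPy files):

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Stand-in dataset: 100 samples of 8 features; 100 is not a multiple of 32.
dataset = TensorDataset(torch.randn(100, 8))

# Option 1: drop the incomplete final batch so every batch has exactly 32 samples.
loader = DataLoader(dataset, batch_size=32, drop_last=True)

# Option 2 (often the real fix): never hard-code the batch size in the model.
# A reshape like x.view(32, -1) raises a size mismatch on the smaller last
# batch; x.view(x.size(0), -1) works for any batch size.
class MLP(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 2))

    def forward(self, x):
        x = x.view(x.size(0), -1)  # batch dimension taken from the input itself
        return self.net(x)

model = MLP()
for (batch,) in DataLoader(dataset, batch_size=32):  # no drop_last needed now
    out = model(batch)  # the last batch has only 4 samples, and still works
```

If the traceback points at a `view`/`reshape` with a literal 32 in it, Option 2 is the one to apply.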