Using a common batch size with ConcatDataset

Hi Alban
Yes, in both MNIST and USPS the size of each sample is [1, 28, 28].
I am able to use each dataset independently with any batch size I want.
It throws an error when I use torch.utils.data.ConcatDataset(datasets) with any batch size other than 1 or 2.
I will give an example.
Let's say I have batch size 128. When using MNIST alone, the last iteration will be of size mod(60000, 128) = 96, i.e. [96, 1, 28, 28].
I guess this uneven last batch is throwing things off when I concatenate the datasets.
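To check whether the uneven last batch is really the culprit, here is a minimal sketch that reproduces the setup with dummy tensors of the same shapes (MNIST-sized and USPS-sized stand-ins, so nothing needs to be downloaded; the names `mnist_like` and `usps_like` are placeholders, not the real datasets):

```python
import torch
from torch.utils.data import TensorDataset, ConcatDataset, DataLoader

# Stand-ins for MNIST (60000 samples) and USPS (7291 samples),
# both with sample shape [1, 28, 28] as in the post.
mnist_like = TensorDataset(torch.zeros(60000, 1, 28, 28),
                           torch.zeros(60000, dtype=torch.long))
usps_like = TensorDataset(torch.zeros(7291, 1, 28, 28),
                          torch.zeros(7291, dtype=torch.long))

combined = ConcatDataset([mnist_like, usps_like])
loader = DataLoader(combined, batch_size=128)

# Collect the batch sizes; only the final one is smaller than 128.
sizes = [x.shape[0] for x, _ in loader]
print(sizes[-1])  # 67291 % 128 == 91, the uneven last batch
```

If the smaller final batch is indeed what breaks downstream code, passing `drop_last=True` to the DataLoader discards that incomplete batch so every iteration yields exactly 128 samples.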