How can I set my batch size so that every batch contains one sample from each class?

Hi all
I want each batch in every epoch to cover all classes in my dataset. For example, with MNIST, the first batch would look like (0, 1, 2, 3, 4, 5, 6, 7, 8, 9), i.e. one sample from each of the 10 classes (labels 0 to 9).
Any example code in PyTorch would be much appreciated.

I understand what you're trying to do, but what if your dataset is imbalanced?
You might get (0, 1, 2, 3, 4, 5, 6, 7, 8, 9) at the beginning, but the batches will drift toward (4, 4, 4, 4, 4, 4, 4, 4, 4, 4) by the end of training (assuming the dataset contains many more 4s than other digits).

Consequently, I suggest using WeightedRandomSampler.
Here is an example: pytorch_misc/weighted_sampling.py at master · ptrblck/pytorch_misc · GitHub
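
For reference, here is a minimal sketch of how WeightedRandomSampler is typically set up. The toy dataset and its class counts below are made up for illustration; see the linked file for the full example:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Toy imbalanced dataset: 900 samples of class 0, 100 of class 1 (made-up numbers)
targets = torch.cat([torch.zeros(900, dtype=torch.long),
                     torch.ones(100, dtype=torch.long)])
data = torch.randn(1000, 10)
dataset = TensorDataset(data, targets)

# Weight each sample by the inverse frequency of its class
class_counts = torch.bincount(targets)
class_weights = 1.0 / class_counts.float()
sample_weights = class_weights[targets]

sampler = WeightedRandomSampler(sample_weights,
                                num_samples=len(sample_weights),
                                replacement=True)
loader = DataLoader(dataset, batch_size=10, sampler=sampler)

# Each batch should now contain roughly balanced classes
x, y = next(iter(loader))
print(y)
```

Note that this balances classes in expectation, so batches are only approximately balanced; it does not guarantee exactly one sample per class per batch.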

Thanks

Hi, thanks for taking the time to answer. I'm confused about how to gather one sample from every class into a single batch, and for simplicity I'm assuming my dataset is balanced. As you know, the default batching is random, so the first batch may look like
(1, 2, 0, 4, 8, 4, 6, 3, 0, 4)
where there are no samples from classes 3, 5, 7, and 9.
I need every batch to contain all labels.
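
For what it's worth, one way to guarantee that is a custom batch sampler that groups indices by label and draws one index per class for each batch. Below is a sketch assuming a classification dataset that exposes its labels (OnePerClassBatchSampler is a made-up name, not a standard PyTorch API):

```python
import random
from torch.utils.data import DataLoader, Sampler
from torchvision import datasets, transforms

class OnePerClassBatchSampler(Sampler):
    """Yields batches containing exactly one index from each class."""
    def __init__(self, targets, num_batches):
        self.num_batches = num_batches
        # Group sample indices by their class label
        self.class_indices = {}
        for idx, label in enumerate(targets):
            self.class_indices.setdefault(int(label), []).append(idx)

    def __iter__(self):
        for _ in range(self.num_batches):
            # One random index per class -> batch size == number of classes
            yield [random.choice(idxs) for idxs in self.class_indices.values()]

    def __len__(self):
        return self.num_batches

dataset = datasets.MNIST(root=".", train=True, download=True,
                         transform=transforms.ToTensor())
sampler = OnePerClassBatchSampler(dataset.targets.tolist(), num_batches=100)
loader = DataLoader(dataset, batch_sampler=sampler)

x, y = next(iter(loader))
print(y)  # one sample from each of the 10 digit classes
```

Since the sampler is passed via `batch_sampler`, the batch size is implicitly the number of classes, which matches what you described for MNIST.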