Uniform-class batch training vs. non-uniform-class batch training

When you train a deep learning model with SGD, each batch is composed of randomly or sequentially selected samples by default.
I want to compare uniform-class batch training with non-uniform-class batch training.
More specifically, for CIFAR-10 with batch_size = 10, uniform-class batch training means each batch contains exactly one sample of each label 0-9. Non-uniform-class batch training means a batch may contain a label multiset such as 0, 0, 0, 2, 4, 5, 5, 7, 8, 9. The order of labels within a batch doesn't matter.

For the first approach, you could access the targets attribute of the CIFAR10 dataset and store the data indices for each label class.
Then, inside a custom Dataset, you could load one sample per class and return the complete batch from the __getitem__ method.
Note that since this approach manually creates the batch in Dataset.__getitem__, you would have to set batch_size=1 in your DataLoader.
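A minimal sketch of that idea, assuming PyTorch is installed. The class name UniformClassBatchDataset is illustrative, not an official API; it works with any dataset exposing a `targets` list, as torchvision's CIFAR10 does. A small synthetic stand-in is used here so the sketch runs without downloading CIFAR-10 — swap in `torchvision.datasets.CIFAR10` for the real task.

```python
import random

import torch
from torch.utils.data import DataLoader, Dataset


class UniformClassBatchDataset(Dataset):
    """Builds one batch containing exactly one sample per class."""

    def __init__(self, base_dataset, num_classes=10):
        self.base = base_dataset
        targets = torch.tensor(base_dataset.targets)
        # Store the data indices for each label class.
        self.class_indices = [
            (targets == c).nonzero(as_tuple=True)[0].tolist()
            for c in range(num_classes)
        ]

    def __len__(self):
        # One item is a full batch, so the length is limited by the rarest class.
        return min(len(idxs) for idxs in self.class_indices)

    def __getitem__(self, index):
        # Draw one random sample from each class and stack them into a batch.
        picks = [random.choice(idxs) for idxs in self.class_indices]
        images = torch.stack([self.base[i][0] for i in picks])
        labels = torch.tensor([self.base[i][1] for i in picks])
        return images, labels


class FakeCIFAR10(Dataset):
    """Synthetic stand-in: 100 random 3x32x32 images with labels 0-9."""

    def __init__(self):
        self.data = torch.randn(100, 3, 32, 32)
        self.targets = [i % 10 for i in range(100)]

    def __len__(self):
        return len(self.targets)

    def __getitem__(self, i):
        return self.data[i], self.targets[i]


# The batch is created inside __getitem__, so batch_size=1 in the DataLoader;
# squeeze the extra leading dimension in the training loop.
loader = DataLoader(UniformClassBatchDataset(FakeCIFAR10()), batch_size=1, shuffle=True)
images, labels = next(iter(loader))
print(images.squeeze(0).shape)             # torch.Size([10, 3, 32, 32])
print(sorted(labels.squeeze(0).tolist()))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```

A BatchSampler that yields one index per class would achieve the same thing while keeping the Dataset untouched, but the custom-Dataset version above is the more direct translation of the steps described.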

If I understand the second use case correctly, you are just sampling the data randomly, so setting shuffle=True in your DataLoader should work.
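For completeness, the second case sketched with the same kind of synthetic stand-in for CIFAR-10: plain random sampling via shuffle=True leaves the class composition of each batch uncontrolled, so repeated labels can appear.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Synthetic stand-in for CIFAR-10: 100 random images, labels 0-9.
data = torch.randn(100, 3, 32, 32)
targets = torch.tensor([i % 10 for i in range(100)])

# shuffle=True draws a random permutation each epoch; batches may
# contain duplicate labels and miss others entirely.
loader = DataLoader(TensorDataset(data, targets), batch_size=10, shuffle=True)
images, labels = next(iter(loader))
print(labels.tolist())  # e.g. [3, 0, 0, 7, 5, 9, 2, 5, 8, 1]
```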