How can I get continuous batches when the number of samples is smaller than the batch size?


My dataset contains 40 samples, but I need to train my model with batch_size=64. I'd like to fetch training batches continuously until a certain number of iterations is reached. I also need to reshuffle the dataset each time it repeats. How can I do this?

If you only have 40 samples, you could artificially enlarge the dataset by reporting a multiple of its real length, e.g.

def __len__(self):
    return 2 * len(self.data)  # self.data is assumed to hold the underlying samples

and wrap the index with index = index % len(self.data) inside __getitem__ to map each virtual index back to a real sample.

You can pass this dataset to a DataLoader and set shuffle=True to shuffle the data.
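A minimal, self-contained sketch of this approach (the class name RepeatedDataset, the .data attribute, and the repeat factor are illustrative, not from your code):

```python
import torch
from torch.utils.data import Dataset, DataLoader

class RepeatedDataset(Dataset):
    """Illustrative dataset that virtually repeats a small sample set."""
    def __init__(self, data, repeats):
        self.data = data          # the underlying (e.g. 40) samples
        self.repeats = repeats    # virtual enlargement factor

    def __len__(self):
        # report a virtual length so the DataLoader can fill large batches
        return self.repeats * len(self.data)

    def __getitem__(self, index):
        # wrap the virtual index back onto the real samples
        return self.data[index % len(self.data)]

data = torch.arange(40)                      # 40 dummy samples
dataset = RepeatedDataset(data, repeats=8)   # virtual length 320
loader = DataLoader(dataset, batch_size=64, shuffle=True)

for batch in loader:
    print(batch.shape)  # each batch is torch.Size([64])
```

With shuffle=True the DataLoader draws a fresh permutation of the virtual indices every epoch, so the repeated samples are reshuffled automatically; pick the repeat factor so the virtual length covers the number of iterations you need.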
