Loading pre-batched data from directory

If I have a directory of .h5 files, each containing 500 images of size (300, 300, 1), how do I load each .h5 file and then pass a batch_size quantity of images to the network? For example, with batch_size = 2, I'd pass 2 of those 500 images for the neural network to train on. Currently I can't think of how to do this without including a loop of some sort inside the __getitem__ method of the Dataset class, though I suspect that will cause issues when it's called by the DataLoader.
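For context, here's a sketch of the indexed approach I've been considering (it assumes the files were written with h5py and store their images under a key named "images"; both are guesses on my part). The idea is that __getitem__ returns a single image for a flat index, and the DataLoader assembles the batch itself, so no loop is needed:

```python
import glob
import os

import h5py  # assumption: the .h5 files are readable with h5py
import numpy as np


class H5ImageDataset:
    """Map-style dataset over a directory of .h5 files.

    Assumes each file stores its images under the key "images",
    e.g. with shape (500, 300, 300, 1); adjust `key` to match
    your files. A DataLoader calls __getitem__ once per sample
    and stacks the results into a batch, so no loop is needed.
    """

    def __init__(self, directory, key="images"):
        self.paths = sorted(glob.glob(os.path.join(directory, "*.h5")))
        self.key = key
        # Record how many images each file holds, so a flat index
        # can be mapped to (file, offset) without loading any data.
        counts = []
        for path in self.paths:
            with h5py.File(path, "r") as f:
                counts.append(len(f[self.key]))
        self.offsets = np.cumsum([0] + counts)

    def __len__(self):
        return int(self.offsets[-1])

    def __getitem__(self, idx):
        # Find which file the flat index falls into, then the
        # position of the image within that file.
        file_idx = int(np.searchsorted(self.offsets, idx, side="right") - 1)
        local_idx = idx - self.offsets[file_idx]
        with h5py.File(self.paths[file_idx], "r") as f:
            image = f[self.key][local_idx]  # reads just this one image
        return image.astype(np.float32)
```

The plan would then be to wrap it in `torch.utils.data.DataLoader(dataset, batch_size=2, shuffle=True)`, which should yield batches of shape (2, 300, 300, 1). Is returning one image per __getitem__ call and letting the DataLoader do the batching the right pattern here, or is there a better way?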

Much appreciated,
xandrovich