'steps per epoch' for dataloader

I have a large dataset (over 200,000 images) that I am using for semantic segmentation. My batch size is 1 due to the size of the images and the network. I would like to train this model by iterating over only 4000-5000 images per epoch so I can run validation more often. In Keras, fit_generator has a ‘steps_per_epoch’ option to set how many batches of samples are seen per epoch. Is there something similar already implemented in PyTorch using the Dataset/DataLoader classes?

Link to keras’ fit_generator docs: https://keras.io/models/sequential/

The easiest way is to call your evaluation every x steps:

steps = 4000
for batch_idx, (data, target) in enumerate(train_loader):
    # your training routine

    # evaluate every `steps` batches; the +1 avoids triggering on batch 0
    if (batch_idx + 1) % steps == 0:
        # your evaluation routine
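If you would rather keep the "short epoch" structure (so each epoch really is a fixed number of batches), one option is to cap the iteration with itertools.islice. A minimal sketch, assuming train_loader is any iterable of batches; the names run_epoch and fake_loader are hypothetical stand-ins:

```python
from itertools import islice

def run_epoch(loader, steps_per_epoch):
    """Consume at most `steps_per_epoch` batches from `loader`."""
    seen = []
    for batch in islice(loader, steps_per_epoch):
        # your training routine would go here
        seen.append(batch)
    return seen

# toy stand-in for a DataLoader yielding 10 batches
fake_loader = range(10)
batches = run_epoch(fake_loader, steps_per_epoch=4)
# batches now holds the first 4 "batches": [0, 1, 2, 3]
```

A sampler-level alternative: torch.utils.data.RandomSampler accepts a num_samples argument, so you can build a DataLoader that draws only a subset of the dataset each epoch, which is closer in spirit to Keras' steps_per_epoch.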