I would like to use an iterator over a DataLoader instead of a for loop. At the beginning of each training step I use the following code:
```python
train_loader = DataLoader(dataset=train_dataset, batch_size=8, shuffle=True, num_workers=4)
train_loader_iter = iter(train_loader)

while True:
    try:
        data_dict = next(train_loader_iter)
    except StopIteration:
        print('Refreshing iterator...')
        train_loader_iter = iter(train_loader)
        data_dict = next(train_loader_iter)
        print('Iterator refreshed...')
```
However, when my iterator should refresh (i.e. I see 'Refreshing iterator...' printed), the process hangs and gets stuck indefinitely. Am I doing something wrong, or is this just not possible? I'm on PyTorch 1.6.0 and my dataset is a simple subclass of the Dataset class.
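For what it's worth, the catch-StopIteration-and-rebuild pattern itself seems sound: with a plain Python iterable it wraps around as expected. Here is a minimal sketch with no torch involved, using a toy list of "batches" standing in for the DataLoader, so the hang appears specific to the DataLoader (presumably its worker processes) rather than to the control flow:

```python
# Toy stand-in for train_loader: a 3-"batch" epoch.
dataset = [f'batch_{i}' for i in range(3)]

loader_iter = iter(dataset)
seen = []

for step in range(5):  # 5 training steps across the epoch boundary
    try:
        batch = next(loader_iter)
    except StopIteration:
        # Epoch exhausted: rebuild the iterator and fetch the first batch.
        loader_iter = iter(dataset)
        batch = next(loader_iter)
    seen.append(batch)

print(seen)  # wraps around after the third step
```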