I am trying to have two data loaders each emit a batch of data within the same training loop, like so:
data_loader1 = torch.utils.data.DataLoader(train_set1, batch_size=run.batch, shuffle=run.shuf, drop_last=True)
data_loader2 = torch.utils.data.DataLoader(train_set2, batch_size=run.batch, shuffle=run.shuf, drop_last=True)

for image_batch, labels in data_loader1:
    image_batch2, labels2 = next(iter(data_loader2))
    # code within training loop
This works right up until the second data loader runs out of images; apparently next(iter()) will not go back to the beginning of the dataset, and instead raises StopIteration.
This post explains the error: https://stackoverflow.com/questions/48180209/stop-iteration-error-when-using-next
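For context, here is a minimal, self-contained sketch that reproduces the failure; the dataset contents and sizes are made up for illustration, and I have hoisted the iter() call out of the loop, since that is where the StopIteration surfaces once the smaller dataset is exhausted:

import torch
from torch.utils.data import DataLoader, TensorDataset

# Made-up datasets: the second one is smaller, so its loader exhausts first.
train_set1 = TensorDataset(torch.randn(100, 3), torch.randint(0, 2, (100,)))
train_set2 = TensorDataset(torch.randn(40, 3), torch.randint(0, 2, (40,)))

data_loader1 = DataLoader(train_set1, batch_size=10, shuffle=True, drop_last=True)
data_loader2 = DataLoader(train_set2, batch_size=10, shuffle=True, drop_last=True)

loader2_iter = iter(data_loader2)  # iterator created once, outside the loop
for image_batch, labels in data_loader1:
    # Works for 4 batches, then raises StopIteration: the exhausted
    # iterator does not wrap around to the start of the dataset.
    image_batch2, labels2 = next(loader2_iter)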
So the problem is to have two data loaders emit a batch each within the same loop, but without using next(iter()) and without nesting one for loop inside the other (for performance reasons).
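The only workaround I have come up with so far is to keep an explicit iterator over the second loader and re-create it whenever it raises StopIteration; sketched against the loaders above, it avoids both next(iter()) in the loop body and nested loops, but it feels clunky:

loader2_iter = iter(data_loader2)
for image_batch, labels in data_loader1:
    try:
        image_batch2, labels2 = next(loader2_iter)
    except StopIteration:
        # Re-create the iterator so the second loader starts over
        # (with a fresh shuffle, if shuffling is enabled).
        loader2_iter = iter(data_loader2)
        image_batch2, labels2 = next(loader2_iter)
    # code within training loop

Is there a cleaner way to do this?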
Any ideas?