I have a model where the training logic requires looking at two different batches in each iteration. How can I use DataLoader
for this?
Simply take two batches from an iterator over the loader. Note that a DataLoader is not itself an iterator, so wrap it with iter() first:
data_iter = iter(data_loader)
batch_1 = next(data_iter)
batch_2 = next(data_iter)
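As a self-contained sketch of that idea (the dataset here is a toy stand-in, not from the original post):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Toy dataset: 10 scalar samples.
dataset = TensorDataset(torch.arange(10).float())
data_loader = DataLoader(dataset, batch_size=2)

# Wrap the loader in iter() so we can pull batches manually.
data_iter = iter(data_loader)
batch_1 = next(data_iter)  # first batch: samples 0 and 1
batch_2 = next(data_iter)  # second batch: samples 2 and 3
```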
Here is the code snippet I ended up using:
import copy

train_dl_cp = copy.deepcopy(train_dl)  # independent copy of the loader
for epoch in range(epochs):
    it = iter(train_dl_cp)  # fresh iterator each epoch
    for b, x1 in enumerate(train_dl):
        x2 = next(it)
        ...  # do something with x1 and x2
I think this should be fine: both loaders yield the same number of batches per epoch, so next(it) never raises StopIteration.