An infinite dataloader

I use a dataloader to get batches of some image data. My idea is to set max_epoch manually; for now I use only one image to overfit the network and check how it performs. I use `for data in iter(Mydataloader)` to get the batch data. But after one epoch the iterator is exhausted and yields nothing: the code keeps running without any warning, yet the training loop no longer executes because the dataloader iterator is spent.

How can I fix this? I want to get data in every epoch, i.e. a dataloader that goes through the whole dataset in one epoch and then gives me a fresh iterator for the next one. Right now I have to re-assign `train_iter = iter(train_loader)` before every training loop, because the previous iterator is exhausted.
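Here is a minimal sketch of the behaviour I mean (using a hypothetical toy TensorDataset, not my real data):

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# hypothetical toy dataset, just to show the exhaustion behaviour
toy_set = TensorDataset(torch.arange(8).float().unsqueeze(1))
toy_loader = DataLoader(toy_set, batch_size=4, shuffle=True)

toy_iter = iter(toy_loader)
print(len(list(toy_iter)))  # 2 -> the first pass yields both batches
print(len(list(toy_iter)))  # 0 -> the same iterator is exhausted, no warning is raised
```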

Can you post your source code?

```python
from torch.utils.data import DataLoader

train_set = Dataset(args.data_dir, parttern='train')
test_set = Dataset(args.data_dir, parttern='test')
train_loader = DataLoader(dataset=train_set, batch_size=args.batch_size, shuffle=True, num_workers=0, pin_memory=True, drop_last=False)
test_loader = DataLoader(dataset=test_set, batch_size=args.batch_size, shuffle=True, num_workers=0, pin_memory=True, drop_last=True)

for i in tqdm(range(start_epoch + 1, args.max_epoch + 1)):
    model.train()
    # need to re-assign here, otherwise the iterators stay exhausted
    train_iter = iter(train_loader)
    test_iter = iter(test_loader)
    for origin, mask, inpaint in train_iter:
        origin = origin.to(device)
        mask = mask.to(device)
        inpaint = inpaint.to(device)
        result = model(origin, mask)
        loss_dict = inpaint_crit(origin, mask, result, inpaint)
        loss = 0.0
        for key, value in loss_dict.items():
            loss += loss_dict[key]
            writer.add_scalar('loss_{:s}'.format(key), value, i)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Can you format that as code?

I don’t know how, it seems the spaces are not working…

Put three backticks (```) around your code.

I’ve formatted that.

Is there any reason for creating the iterators?

You could simply do `for origin, mask, inpaint in train_loader:`, which is what I do all the time and it works fine for me.
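For instance, a rough sketch of your loop without the explicit iterators (reusing the names from your snippet and assuming the rest of your setup is unchanged):

```python
for epoch in tqdm(range(start_epoch + 1, args.max_epoch + 1)):
    model.train()
    # the for statement calls iter(train_loader) internally,
    # so a fresh iterator is created at the start of every epoch
    for origin, mask, inpaint in train_loader:
        origin = origin.to(device)
        mask = mask.to(device)
        inpaint = inpaint.to(device)
        result = model(origin, mask)
        loss_dict = inpaint_crit(origin, mask, result, inpaint)
        loss = sum(loss_dict.values())
        for key, value in loss_dict.items():
            writer.add_scalar('loss_{:s}'.format(key), value, epoch)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

The `for` statement creates the iterator behind the scenes each time the inner loop starts, so you get the whole dataset again in every epoch without re-assigning `train_iter` yourself.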