How to update DataLoader in between epochs?

So, I am trying out curriculum learning, where the idea is to update the dataloader after a certain number of epochs.
My current code looks like this -

for epoch in range(config.max_epochs):
    print(f"Epoch: {epoch}")
    progress = tqdm.tqdm(total=train_batch_num,
                         ncols=75,
                         desc=f'Train {epoch}')
    model.train()
    total_train_accuracy = 0
    total_train_loss = 0
    # UPDATE TRAIN DATALOADER HERE
    for batch in train_dataloader:
        batch = {k: v.to(device) for k, v in batch.items()}
        optimizer.zero_grad()
        outputs = model(**batch)

I want to update the train_dataloader, as mentioned above.
Can I use something like train_dataloader = DataLoader(train_dataset, collate_fn=collate_fn, batch_size=config.batch_size, shuffle=True) to update the train_dataloader after a given number of epochs? I am currently not using multiple workers for the dataloader; however, this will be trained on a GPU. My doubt is: will this update be reflected in the dataloader during training?

Based on your code snippet, it should work. Recreating the DataLoader between epochs at the point you marked is fine; just don't modify it while it is yielding batches and expect the batches to change halfway through an epoch.
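
For example, something along these lines would do it. This is just a sketch reusing your variable names; curriculum_step and make_curriculum_dataset are placeholders for however you decide when to switch stages and how to build the data for the current stage:

from torch.utils.data import DataLoader

for epoch in range(config.max_epochs):
    # Rebuild the loader every `curriculum_step` epochs (placeholder value).
    # `make_curriculum_dataset(epoch)` stands in for your own logic that
    # returns the dataset for the current curriculum stage.
    if epoch % curriculum_step == 0:
        train_dataset = make_curriculum_dataset(epoch)
        train_dataloader = DataLoader(train_dataset,
                                      collate_fn=collate_fn,
                                      batch_size=config.batch_size,
                                      shuffle=True)

    model.train()
    for batch in train_dataloader:
        batch = {k: v.to(device) for k, v in batch.items()}
        optimizer.zero_grad()
        outputs = model(**batch)
        # ... loss, backward, optimizer.step() as in your existing loop

Since the new DataLoader is created before the inner for loop starts iterating, each epoch simply sees whatever dataset the loader was built from at that point.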