Dynamic data iterator value change during training

Hi!

My Data Loader :

class LoadData(Dataset):
    def __init__(self, ..., ...., ...):
        self.ns = (640, 640)
        .....
        .....

    def __getitem__(self, index):
        img = resize(img[index], self.ns)
        .....
        .....
        return img

    def set_size(self, ns):
        self.ns = ns

And the training loop

for img in dataiterator:
    forward(img)
    backward()
    dataiterator.dataset.set_size(new_ns)

I would like to resize images dynamically after each iteration. When I do it as above, it does not work. Is there any simple trick to do that? Please help.

Best,

You won’t be able to manipulate the underlying .dataset through the DataLoader, if you are using multiple workers during the epoch, since each worker will use a separate copy of the Dataset.
You could manipulate the .dataset after each epoch and before the start of a new one (if persistent_workers=False) or you could iterate the Dataset and create the batches manually.
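A minimal sketch of the per-epoch approach, assuming a placeholder dataset (`ResizableDataset` is a stand-in for your `LoadData`, not code from the original post). With the default `persistent_workers=False`, workers are torn down at the end of each epoch, so a size change made between epochs is picked up when the next epoch starts:

```python
import torch
from torch.utils.data import Dataset, DataLoader

# Hypothetical minimal dataset standing in for LoadData.
class ResizableDataset(Dataset):
    def __init__(self):
        self.ns = (640, 640)

    def __len__(self):
        return 8

    def __getitem__(self, index):
        # A real dataset would load img[index] and resize it to self.ns;
        # here we just return a tensor of the current target size.
        return torch.zeros(3, *self.ns)

    def set_size(self, ns):
        self.ns = ns

ds = ResizableDataset()
# persistent_workers=False (the default): workers are recreated each epoch,
# so they see the new self.ns at the start of the next epoch.
loader = DataLoader(ds, batch_size=4, num_workers=0)

shapes = []
for epoch in range(2):
    for batch in loader:
        shapes.append(tuple(batch.shape))  # forward/backward would go here
    ds.set_size((320, 320))  # safe point: between epochs, not mid-epoch
```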
Alternatively, you could also try to forward specific arguments to the __getitem__ through a custom sampler to switch between different behaviors.
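One way to sketch the sampler idea (all names here, e.g. `SizedBatchSampler`, are hypothetical): wrap a batch sampler so that every index it yields becomes an `(index, size)` tuple. Since the size travels with the indices, it reaches `__getitem__` even in worker processes with `num_workers > 0`:

```python
import random
import torch
from torch.utils.data import Dataset, DataLoader, BatchSampler, SequentialSampler

# Hypothetical dataset whose __getitem__ receives (index, size) tuples.
class SizedDataset(Dataset):
    def __len__(self):
        return 8

    def __getitem__(self, item):
        index, ns = item  # the sampler sends the target size with the index
        # A real dataset would load img[index] and resize it to ns.
        return torch.zeros(3, *ns)

class SizedBatchSampler:
    """Wraps a batch sampler and attaches a per-batch size to each index,
    so the size change propagates to DataLoader workers."""
    def __init__(self, batch_sampler, sizes):
        self.batch_sampler = batch_sampler
        self.sizes = sizes

    def __iter__(self):
        for batch in self.batch_sampler:
            ns = random.choice(self.sizes)   # one size per batch
            yield [(i, ns) for i in batch]

    def __len__(self):
        return len(self.batch_sampler)

ds = SizedDataset()
sampler = SizedBatchSampler(
    BatchSampler(SequentialSampler(ds), batch_size=4, drop_last=False),
    sizes=[(320, 320), (640, 640)],
)
loader = DataLoader(ds, batch_sampler=sampler, num_workers=0)

shapes = [tuple(batch.shape) for batch in loader]
```

Keeping one size per batch (rather than per sample) lets the default collate function stack the tensors without shape mismatches.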


Thank you for your answer. Let me see how I can create a custom sampler and control the image size by passing an argument to __getitem__.
