Is it OK to change dataset during training?

I use a Dataset and a DataLoader in my training script.

Is it OK to change a dataset attribute during training?

For example,

from torch.utils.data import DataLoader

dataset = CustomDataset()
loader = DataLoader(dataset, num_workers=8)

for i, data in enumerate(loader):
    dataset.x = new_x  # reassign a dataset attribute mid-epoch

I saw that the DataLoader uses multiple worker processes and that the dataset is copied to each worker.

Will the change to the dataset be reflected in the loader?

Thanks,

No, if you are using multiple workers, the changes won’t be reflected.
What is your use case?
Maybe you could use shared arrays?
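
For example, something along these lines could work (just a rough sketch, not tested here; CustomDataset and x mirror the snippet above, and the in-place update is the key point):

import torch
from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    def __init__(self, x):
        # share_memory_() moves the tensor's storage into shared memory,
        # so worker processes see in-place writes made by the main process
        self.x = x.share_memory_()

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx]

dataset = CustomDataset(torch.randn(1000, 10))
loader = DataLoader(dataset, batch_size=32, num_workers=8)

for i, data in enumerate(loader):
    # in-place update of the shared storage; reassigning dataset.x would
    # create a new (non-shared) tensor in the main process only
    dataset.x.copy_(torch.randn(1000, 10))

Note that the workers prefetch a few batches ahead, so an update only shows up in batches loaded after it, and this relies on the storage actually being shared with the workers (it works with the default fork start method on Linux; I haven't checked other start methods).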

My use case is a dataset that creates samples with a sampling function.

During training, the sampling strategy changes.

I think I have to find an alternative way… :frowning:
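
One alternative I might try is to change the strategy only between epochs. If I understand correctly, new worker processes are started each time the loader is iterated, and they copy the dataset at that point, so the change takes effect in the next epoch. Roughly (names and strategies made up):

import random
import torch
from torch.utils.data import Dataset, DataLoader

class CustomDataset(Dataset):
    def __init__(self, n=1000):
        self.data = torch.randn(n, 10)
        self.strategy = "uniform"

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        if self.strategy == "uniform":
            j = random.randrange(len(self.data))
        else:
            # placeholder for a different strategy, e.g. only the second half
            j = random.randrange(len(self.data) // 2, len(self.data))
        return self.data[j]

dataset = CustomDataset()
# worker processes (and their copy of the dataset) are created anew
# each time the loader is iterated
loader = DataLoader(dataset, batch_size=32, num_workers=8)

for epoch in range(10):
    if epoch == 5:
        dataset.strategy = "second_half"  # changed in the main process between epochs
    for batch in loader:                  # workers copy the updated dataset here
        pass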

Hello, what if I change dataloader.dataset.x as in this post?

Will that be reflected in the multiple workers?

I don’t think so, as stated in my reply above.
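
A quick way to check it yourself (just a sketch with made-up names) is to return the attribute from __getitem__ and watch whether the batches pick up a reassignment made in the main process:

import torch
from torch.utils.data import Dataset, DataLoader

class CheckDataset(Dataset):
    def __init__(self):
        self.x = torch.zeros(1)

    def __len__(self):
        return 100

    def __getitem__(self, idx):
        # returns whatever value this process currently holds in self.x
        return self.x.clone()

dataset = CheckDataset()
loader = DataLoader(dataset, num_workers=8)

for i, data in enumerate(loader):
    dataset.x = torch.ones(1)  # reassignment in the main process only
    print(i, data.item())      # with num_workers > 0 the workers keep returning 0.0

With num_workers=0 (single-process loading) the change would show up from the next batch onward, which is exactly the difference the worker processes introduce.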