Iterating over multiple dataloaders

Hello,

I am trying to write code that iterates over multiple dataloaders, not just two (train and val), but 500 of them.
I iterate over a dataset, and for each item I extract crops and apply a CNN to each crop.

I wonder if there is a way to purge each dataloader after its iteration finishes, since they use all of my computer's resources and block it.
I then get the following message:


```
Exception ignored in: <bound method _DataLoaderIter.__del__ of <torch.utils.data.dataloader._DataLoaderIter object at 0x7f7f6cc15160>>
Traceback (most recent call last):
  File "/home/gianni/pytorch_venv/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 399, in __del__
    self._shutdown_workers()
  File "/home/gianni/pytorch_venv/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 378, in _shutdown_workers
    self.worker_result_queue.get()
  File "/usr/lib/python3.5/multiprocessing/queues.py", line 345, in get
    return ForkingPickler.loads(res)
  File "/home/gianni/pytorch_venv/lib/python3.5/site-packages/torch/multiprocessing/reductions.py", line 151, in rebuild_storage_fd
    fd = df.detach()
  File "/usr/lib/python3.5/multiprocessing/resource_sharer.py", line 57, in detach
    with _resource_sharer.get_connection(self._id) as conn:
  File "/usr/lib/python3.5/multiprocessing/resource_sharer.py", line 87, in get_connection
    c = Client(address, authkey=process.current_process().authkey)
  File "/usr/lib/python3.5/multiprocessing/connection.py", line 487, in Client
    c = SocketClient(address)
  File "/usr/lib/python3.5/multiprocessing/connection.py", line 614, in SocketClient
    s.connect(address)
ConnectionRefusedError: [Errno 111] Connection refused
```


I saw that other people have reported the same issue:

but it is not exactly the same as mine.
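For reference, here is a simplified sketch of the pattern I am using: one fresh dataloader per set of crops, with an explicit delete after each iteration (the dataset, model, and batch size here are placeholders, not my real code):

```python
# Sketch: one DataLoader per crop set, released explicitly between iterations.
import gc
import torch
from torch.utils.data import DataLoader, TensorDataset

def process_crops(crop_tensors, model):
    """Run the CNN over one set of crops, then drop the loader."""
    dataset = TensorDataset(crop_tensors)
    # num_workers=0 keeps loading in the main process; with num_workers > 0,
    # worker shutdown is where the ConnectionRefusedError above appears.
    loader = DataLoader(dataset, batch_size=16, num_workers=0)
    outputs = [model(batch) for (batch,) in loader]
    del loader    # purge the dataloader after this iteration
    gc.collect()  # encourage immediate cleanup of its resources
    return outputs

model = torch.nn.Identity()          # stand-in for the real CNN
crops = torch.randn(8, 3, 32, 32)    # one item's extracted crops
results = process_crops(crops, model)
```

With `num_workers=0` the error goes away but everything runs in the main process; with workers enabled, the shutdown still consumes resources, which is why I would like a way to purge the loader cleanly.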