I hit the same situation with next(iter(data_loader)) (my /dev/shm is 256G). Setting num_workers=0 does indeed fix it, but with num_workers=0 data loading takes much longer. There is an issue tracking this situation, https://github.com/pytorch/pytorch/issues/13246, but can we have a better solution?
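For reference, a minimal sketch of the workaround being discussed (the dataset here is a dummy stand-in, not the real one):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy map-style dataset standing in for the real data (assumption)
dataset = TensorDataset(torch.randn(100, 3), torch.randint(0, 10, (100,)))

# num_workers=0 loads batches in the main process, so no worker
# subprocesses touch /dev/shm -- slower, but avoids the hang/crash
data_loader = DataLoader(dataset, batch_size=8, num_workers=0)

inputs, labels = next(iter(data_loader))
```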