For me, using multiple workers is necessary, and it used to work (in July), but after an update to my PyTorch distro I started getting this error.
Setting num_workers=1 didn’t fix this issue.
I've opened a new thread here.