However, the real way around this problem lies in refactoring your code to comply with Python's Windows-specific multiprocessing guidelines, as discussed here in this StackOverflow thread.
This subject is touched upon in the Python 2 documentation for multiprocessing: Programming Guidelines, Windows. While the Python 3 documentation shares similar guidelines (see here), the Python 2 version is more explicit about Windows:
Make sure that the main module can be safely imported by a new Python interpreter without causing unintended side effects (such as starting a new process).
In short, the idea here is to wrap the example code inside an if __name__ == '__main__'
statement, as follows:
# Deep Learning with PyTorch: A 60 Minute Blitz » Training a classifier
# Load the CIFAR10 data
import torch
import torchvision
import torchvision.transforms as transforms

# Safe DataLoader multiprocessing with Windows
if __name__ == '__main__':
    # Code to load the data with num_workers > 1
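Concretely, using the loading code from that part of the tutorial, the wrapped version looks roughly like the sketch below. The transform and the batch_size/num_workers values are reproduced from memory of the tutorial and are only illustrative, not a verbatim copy:

import torch
import torchvision
import torchvision.transforms as transforms

# All DataLoader creation happens under the main guard, so Windows can
# re-import this module in worker processes without side effects.
if __name__ == '__main__':
    transform = transforms.Compose(
        [transforms.ToTensor(),
         transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))])
    trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                            download=True, transform=transform)
    trainloader = torch.utils.data.DataLoader(trainset, batch_size=4,
                                              shuffle=True, num_workers=2)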
While the tutorial seems to spread the code over multiple scripts, the cleanest way around this is to wrap all operations in functions and then call them from inside an if __name__ == '__main__'
clause (a fuller sketch follows the skeleton below):
# Imports for dataset generation, training, etc.

def load_datasets(...):
    # Code to load the datasets with multiple workers

def train(...):
    # Code to train the model

if __name__ == '__main__':
    load_datasets()
    train()
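Putting it together, a minimal runnable sketch of this refactor could look like the following. The function signatures, the abbreviated loading code, and the placeholder training loop are assumptions made for illustration; substitute the tutorial's actual transform, model, and training loop:

import torch
import torchvision
import torchvision.transforms as transforms


def load_datasets():
    # Same CIFAR10 loading as above, abbreviated to a plain ToTensor transform.
    trainset = torchvision.datasets.CIFAR10(root='./data', train=True,
                                            download=True,
                                            transform=transforms.ToTensor())
    return torch.utils.data.DataLoader(trainset, batch_size=4,
                                       shuffle=True, num_workers=2)


def train(trainloader):
    # Stand-in for the real training loop: just iterate over the batches.
    for images, labels in trainloader:
        pass  # replace with the tutorial's forward/backward pass


if __name__ == '__main__':
    # Worker processes re-import this module; only the guard runs the work.
    trainloader = load_datasets()
    train(trainloader)

Because everything that spawns worker processes is reached only through the guard, importing the module in a fresh interpreter (which Windows does for every DataLoader worker) no longer re-executes the data loading or training code.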