We wanted to let you know that we are considering deprecating the DataParallel module (torch.nn.DataParallel, a.k.a. DP) with the upcoming v1.11 release of PyTorch. Our plan is to keep DataParallel in maintenance mode for the 12 months following the v1.11 release and then remove it completely from our code base. Regardless of whether we follow this plan, we highly encourage everyone relying on DataParallel to migrate to DistributedDataParallel (torch.nn.parallel.DistributedDataParallel, a.k.a. DDP), as we consider it the future of PyTorch Distributed.
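For anyone planning the migration, here is a minimal single-process sketch of the change. It uses a toy torch.nn.Linear model, the CPU "gloo" backend, and a world size of 1 purely for illustration; a real multi-GPU job would launch one process per GPU (e.g. via torchrun), which sets the rank, world size, and rendezvous environment variables for you.

```python
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def train_step():
    # torchrun normally sets these; we set them manually so this
    # sketch runs as a single CPU process.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = torch.nn.Linear(8, 2)  # toy model for illustration
    # Old style: model = torch.nn.DataParallel(model)
    # New style: wrap with DDP; gradients are averaged across processes
    # during backward().
    ddp_model = DDP(model)

    opt = torch.optim.SGD(ddp_model.parameters(), lr=0.1)
    x = torch.randn(4, 8)
    loss = ddp_model(x).sum()
    loss.backward()
    opt.step()

    dist.destroy_process_group()
    return loss.item()
```

Unlike DataParallel, which replicates the model across GPUs inside one process on every forward pass, DDP keeps one process per device, so there is no Python GIL contention and gradient synchronization overlaps with the backward pass.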
Your feedback is very important to us, so please let us know your thoughts on our GitHub post.