Cannot find "distributed" in torch.utils.data module

Here is an example using PyTorch 0.4.0:

In [1]: import torch                                                                                           

In [2]: import torch.utils.data                                                                                

In [3]:                                                                                                        

In [3]: torch.utils.data.distributed.DistributedSampler()                                                      
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-3-89779c61da10> in <module>
----> 1 torch.utils.data.distributed.DistributedSampler()

AttributeError: module 'torch.utils.data' has no attribute 'distributed'

In [4]: torch.utils.data.DataLoader                                                                            
Out[4]: torch.utils.data.dataloader.DataLoader

I am importing the right modules, and I can see the distributed submodule in the documentation, yet I have no idea why I can’t load this module while others, like DataLoader, load fine. In fact, when I hit Tab for autocompletion, the suggestions show me a bunch of submodules to import, but distributed is not among them.

Am I missing something here?

Thanks.

As far as I know this class should be available in 0.4.0.
Could you check your version with print(torch.__version__)?
Maybe your current environment uses an older version?

I checked the version at runtime and it’s indeed PyTorch 0.4.0. This is happening on two different machines, and my work is at a standstill since I have no idea how to fix it. I will appreciate any pointers.

Thanks.

I just checked it in 0.4.0, and the submodule apparently isn’t imported automatically, so the attribute access fails.
Run the following:

from torch.utils.data import distributed

# DistributedSampler needs the dataset to sample from; with the default
# num_replicas/rank it also expects an initialized process group, so pass
# them explicitly if you haven't called torch.distributed.init_process_group.
sampler = distributed.DistributedSampler(dataset, num_replicas=1, rank=0)
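For anyone wondering why the explicit import is needed: in 0.4.0, torch/utils/data/__init__.py apparently does not import the distributed submodule itself, and Python never loads a package’s submodules automatically. A minimal stdlib sketch of the same behavior, using xml / xml.dom as a stand-in for torch.utils.data / torch.utils.data.distributed:

```python
import importlib
import xml

# Importing a package does NOT automatically import its submodules:
# 'xml' is loaded here, but in a fresh interpreter 'xml.dom' is not
# bound on it yet, so attribute access raises AttributeError.
try:
    xml.dom
except AttributeError:
    pass  # same failure mode as torch.utils.data.distributed in 0.4.0

# An explicit import loads the submodule and binds it on the parent package.
dom = importlib.import_module("xml.dom")
print(xml.dom is dom)  # attribute access works after the explicit import
```

This is why `from torch.utils.data import distributed` succeeds even though `torch.utils.data.distributed` raises an AttributeError after a plain `import torch.utils.data`.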

Thanks so much. I’ll report this bug on GitHub, since I found tutorials on other sites (like Uber’s Horovod) that use this failing import.


I don’t think that’s necessary, as it’s working on the current master build, so this issue was probably fixed some time ago. :wink:


Hi ptrblck, I’m sorry to reach out to you this way, but I really don’t have any other way to get help with my problem. It is described in this thread: https://discuss.pytorch.org/t/error-in-dist-scatter-when-mpi-backend-is-used/30426
Could you please do me a favor and take a look?

Sorry, but I think I can’t help you there as I’ve never used a distributed setup.

All right, thanks anyway. Could you recommend someone who might be able to help?