mpi4py and PyTorch

Hi,

I suspect this is a known issue, but I could not find the right way to do it in the forum…

I have some VMs on a server and want to broadcast a torch.Tensor with mpi4py using the following snippet, but without success.

if rank == 0:
    x = torch.tensor([1], dtype=torch.float64)
if rank != 0:
    x = None
    x = comm.bcast(x, root=0)
print("Rank {} receives {}".format(rank, x))

How should I broadcast tensors across several ranks?

Thank you in advance.

Sorry… I messed up: comm.bcast is a collective call, so every rank (including the root) must execute it, not just the non-root ranks. The correct code is

if rank == 0:
    x = torch.tensor([1], dtype=torch.float64)
if rank != 0:
    x = None
x = comm.bcast(x, root=0)
print("Rank {} receives {}".format(rank, x))

and now it works as I intend:

Rank 4 receives tensor([1.], dtype=torch.float64)
Rank 9 receives tensor([1.], dtype=torch.float64)
Rank 0 receives tensor([1.], dtype=torch.float64)
Rank 1 receives tensor([1.], dtype=torch.float64)
Rank 2 receives tensor([1.], dtype=torch.float64)
Rank 3 receives tensor([1.], dtype=torch.float64)
Rank 5 receives tensor([1.], dtype=torch.float64)
Rank 6 receives tensor([1.], dtype=torch.float64)
Rank 7 receives tensor([1.], dtype=torch.float64)
Rank 8 receives tensor([1.], dtype=torch.float64)