Broadcasting list/tuple of tensors to all processes

Broadcasting a single tensor to all processes in PyTorch is straightforward, but I have a tuple/list of tensors. How can I broadcast all of them?

  • passing the tuple/list directly to torch.distributed.broadcast fails, since it expects a single tensor.

  • looping through the tuple/list and broadcasting one tensor at a time works for the first tensor, then fails (see the sketch below).
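
For reference, here is a simplified sketch of the loop variant (the helper name and source rank are placeholders, not my real code):

```python
import torch
import torch.distributed as dist

# dist.broadcast is in-place: every rank must pass a pre-allocated
# tensor with the same shape and dtype as the tensor on the src rank.
def broadcast_one_by_one(tensors, src=0):
    for t in tensors:
        dist.broadcast(t, src=src)
```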

Any idea how to broadcast a tuple/list of tensors in PyTorch?

Would torch.distributed.broadcast_object_list work for you?
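
A minimal sketch of how that could look (the helper function and the assumption that the process group is already initialized, e.g. via torchrun, are mine, not part of the broadcast_object_list API):

```python
import torch
import torch.distributed as dist

def broadcast_tensor_list(tensors, num_tensors, src=0):
    """Broadcast a list/tuple of tensors from rank `src` to all ranks.

    Non-src ranks can pass tensors=None; they only need to know how
    many objects are coming, not their shapes or dtypes.
    """
    if dist.get_rank() == src:
        objects = list(tensors)
    else:
        objects = [None] * num_tensors  # placeholder list of matching length
    # Each element is pickled and broadcast, so arbitrary shapes and
    # dtypes work without pre-allocating buffers on receiving ranks.
    dist.broadcast_object_list(objects, src=src)
    return objects
```

One caveat: because the tensors are pickled, this is slower than dist.broadcast for large tensors, and with the NCCL backend you may want to pass the device argument so the received objects land on the right GPU.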