An error occurred while running a torch.distributed script. My code is:

    import torch.distributed as dist

    dist.init_process_group(backend='tcp',
                            init_method='tcp://[ff15:1e18:5d4c:4cf0:d02d:b659:53ba:b0a7]:23456',
                            world_size=4)

    print('Hello from process {} (out of {})!'.format(
        dist.get_rank(), dist.get_world_size()))

and I get an error:

RuntimeError: Address already in use at /Users/soumith/code/builder/wheel/pytorch-src/torch/lib/THD/process_group/General.cpp:17
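
For what it's worth, "Address already in use" generally means the bind on the port failed because something is already listening there: either a stale process left over from an earlier run, or another rank on the same machine colliding on the port. Below is a minimal sketch of a workaround, assuming the old THD-style API used above (on current PyTorch the backend would be 'gloo', not 'tcp'); the address, port, and the RANK environment variable are placeholders for your own launch setup, not a confirmed fix:

    import os
    import torch.distributed as dist

    # Sketch: use a plain host:port (no multicast address) and pass an
    # explicit rank per process, so each process binds cleanly.
    # 127.0.0.1:23457 and RANK are placeholders for your own setup.
    rank = int(os.environ['RANK'])  # e.g. RANK=0..3, one per process

    dist.init_process_group(backend='tcp',  # 'gloo' on current PyTorch
                            init_method='tcp://127.0.0.1:23457',
                            world_size=4,
                            rank=rank)

    print('Hello from process {} (out of {})!'.format(
        dist.get_rank(), dist.get_world_size()))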

Hey, did you ever fix this problem?