Initialization error when using torch.distributed

I got this error when initializing torch.distributed.

  File "run_classifier_gcn.py", line 393, in <module>
    main()
  File "run_classifier_gcn.py", line 309, in main
    rank=args.local_rank, world_size=args.world_size)
  File "/home/xdwang/ml4h/lib/python3.7/site-packages/torch/distributed/distributed_c10d.py", line 422, in init_process_group
    store, rank, world_size = next(rendezvous_iterator)
  File "/home/xdwang/ml4h/lib/python3.7/site-packages/torch/distributed/rendezvous.py", line 119, in _tcp_rendezvous_handler
    raise _error("rank parameter missing")
ValueError: Error initializing torch.distributed using tcp:// rendezvous: rank parameter missing

But I did actually add the rank parameter at initialization:

    dist.init_process_group(backend=args.dist_backend, init_method='tcp://{}:{}'.format(ip_address, args.port),
                            rank=args.local_rank, world_size=args.world_size)

‘rank parameter missing’ ==> do you want to check whether ‘args.local_rank’ is valid?

Never mind, I got it to work. I had passed the wrong rank at first, thanks!
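
For anyone who hits the same error: the tcp:// rendezvous needs a non-negative rank, so it can help to validate the value before calling init_process_group. Below is a minimal sketch, assuming argparse-style arguments like the ones in the snippet above (local_rank, world_size, port, and dist_backend are placeholders, and ip_address would be the master node's address):

    import argparse

    import torch.distributed as dist

    parser = argparse.ArgumentParser()
    # local_rank defaults to -1, meaning "not set" (e.g. the script was not
    # launched through torch.distributed.launch)
    parser.add_argument('--local_rank', type=int, default=-1)
    parser.add_argument('--world_size', type=int, default=1)
    parser.add_argument('--port', type=int, default=23456)
    parser.add_argument('--dist_backend', type=str, default='gloo')
    args = parser.parse_args()

    # The rank must be an index in [0, world_size); an unset rank of -1 in
    # particular triggers the "rank parameter missing" error shown above.
    if not (0 <= args.local_rank < args.world_size):
        raise ValueError(
            'invalid rank {}; expected a value in [0, {})'.format(
                args.local_rank, args.world_size))

    ip_address = '127.0.0.1'  # placeholder; use the master node's address
    dist.init_process_group(backend=args.dist_backend,
                            init_method='tcp://{}:{}'.format(ip_address, args.port),
                            rank=args.local_rank, world_size=args.world_size)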