Distributed training in multinode

yeah i have changed from localhost to master_ip

is it fine if i change like this?

master:

world_size = 2
mp.spawn(example,
	args=(0,1),
	nprocs=world_size,
	join=True)

other:

world_size = 2
mp.spawn(example,
	args=(2,3),
	nprocs=world_size,
	join=True)