DistributedDataParallel - need help

I am trying to train my network on a cluster of CPU-only machines (no GPUs) using DistributedDataParallel. I have read the associated tutorials, but they are not very good and many details are left unexplained.
For instance, what do the following lines do?

os.environ['MASTER_ADDR'] = 'localhost'
os.environ['MASTER_PORT'] = '12355'  

Can I just copy them as-is? Also, what is a rank, when do we use it, and why is it even a variable? None of these apparently important concepts are clearly explained. Can somebody recommend a good tutorial on how to do parallel computing with PyTorch? The lack of resources is driving me crazy (I have no prior knowledge of parallel computing).
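
For reference, here is a minimal sketch of what I am currently trying to run, adapted from the tutorial's example (I only changed it to use the CPU 'gloo' backend and a toy model, so the function names and values here are just placeholders):

import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

def setup(rank, world_size):
    # the two lines I am asking about
    os.environ['MASTER_ADDR'] = 'localhost'
    os.environ['MASTER_PORT'] = '12355'
    # 'gloo' backend since my machines have no GPUs
    dist.init_process_group('gloo', rank=rank, world_size=world_size)

def demo(rank, world_size):
    setup(rank, world_size)
    model = DDP(torch.nn.Linear(10, 5))   # no device_ids, everything stays on CPU
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    outputs = model(torch.randn(20, 10))
    loss = torch.nn.functional.mse_loss(outputs, torch.randn(20, 5))
    loss.backward()
    optimizer.step()
    dist.destroy_process_group()

if __name__ == '__main__':
    world_size = 2
    # mp.spawn calls demo(rank, world_size) once per process -- is this where the rank comes from?
    mp.spawn(demo, args=(world_size,), nprocs=world_size)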

Any help will be appreciated.