Distributed PyTorch

Does PyTorch allow using multiple computers to parallelize training? I have 8 computers with no GPUs (CPU only) and want to parallelize training across them.

I have read the tutorial below, but it doesn't have information on how to set up the other PCs for distributed training:

https://pytorch.org/tutorials/intermediate/dist_tuto.html
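From that tutorial I can piece together a minimal per-machine script, something like the sketch below. I am assuming the gloo backend (since I have no GPUs) and the env:// rendezvous method; the script name and environment variable values are just my own placeholders. What I can't figure out is how to set up and launch this on the other PCs.

```python
import torch
import torch.distributed as dist

def main():
    # env:// rendezvous: every machine must export MASTER_ADDR and
    # MASTER_PORT (pointing at one machine), WORLD_SIZE (8 here), and a
    # unique RANK (0..7) before running this script.
    # gloo is the CPU-only backend, which is what I need.
    dist.init_process_group(backend="gloo", init_method="env://")

    rank = dist.get_rank()
    world_size = dist.get_world_size()

    # Connectivity check: sum a tensor across all 8 machines.
    # Each rank contributes 1.0, so every rank should print 8.0.
    t = torch.ones(1)
    dist.all_reduce(t, op=dist.ReduceOp.SUM)
    print(f"rank {rank}/{world_size}: all_reduce result = {t.item()}")

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```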

If yes, could you please suggest an example with setup details?
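For instance, is the idea simply to run the same script on all 8 machines, with MASTER_ADDR and MASTER_PORT pointing at one of them, WORLD_SIZE=8, and RANK set to 0 on the master and 1 through 7 on the others? That's just my guess from the tutorial, and I don't know if anything else (firewall, SSH, shared filesystem) needs to be configured.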