RPC - dynamic world size

Is it possible to have a dynamic world size when using torch.distributed.rpc?
I want a changing number of processes communicating over the TensorPipe backend, without explicitly stating a world size, with each process dynamically assigned a rank.
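For reference, this is roughly what the current static API looks like: both `rank` and `world_size` must be fixed up front when calling `rpc.init_rpc` (the worker name, address, and port below are illustrative, and this sketch uses a single process for simplicity):

```python
import os
import torch.distributed.rpc as rpc

# The rendezvous address/port are illustrative placeholders.
os.environ.setdefault("MASTER_ADDR", "localhost")
os.environ.setdefault("MASTER_PORT", "29500")

rpc.init_rpc(
    "worker0",      # unique name for this process
    rank=0,         # rank must be known in advance
    world_size=1,   # total number of processes is fixed at init time
)

# Sanity check: issue a synchronous RPC (here to ourselves).
result = rpc.rpc_sync("worker0", min, args=([3, 1, 2],))

rpc.shutdown()
```

What I am after is a way to skip the fixed `world_size` argument entirely and let workers join and leave after initialization.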

Hey @ItamarWilf,

Unfortunately, this is not yet possible with the RPC package, but it is on our roadmap.