Distributed Training Across Multiple Cloud Providers

I’m trying to train a model with instances from AWS and Google Cloud. Unfortunately, neither Gloo nor NCCL seems to work across multiple cloud providers. I tried using an Algo VPN so that all the instances would appear to be on the same local network, but that didn’t work either. Does PyTorch support training across multiple cloud providers?

Although I don’t think it’s a good idea, PyTorch/Gloo/NCCL don’t know anything about “cloud providers,” so the problem must lie in the network configuration (the nodes must be reachable from each other) and in network latency/throughput.
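
To illustrate, here’s a minimal sketch of bringing up a cross-provider process group with the Gloo backend, which runs over plain TCP. The IP address, port, and interface name are placeholders you’d replace with your own values; the only real requirement is that every node can reach the rendezvous node on the chosen port (open it in both clouds’ firewall/security-group rules):

```python
import os
import torch.distributed as dist

# Placeholder rendezvous address: a public (or VPN-assigned) IP of one node
# that every other node can reach. Open MASTER_PORT in both firewalls.
os.environ.setdefault("MASTER_ADDR", "203.0.113.10")
os.environ.setdefault("MASTER_PORT", "29500")

# Optional: pin Gloo to the interface that carries cross-cloud traffic
# (e.g. the VPN tunnel interface). "eth0" here is just an assumption.
os.environ.setdefault("GLOO_SOCKET_IFNAME", "eth0")

# rank and world_size would normally come from your launcher or environment.
rank = int(os.environ.get("RANK", "0"))
world_size = int(os.environ.get("WORLD_SIZE", "2"))

# Gloo has no notion of a cloud provider; it only needs TCP connectivity.
dist.init_process_group(
    backend="gloo",
    init_method="env://",
    rank=rank,
    world_size=world_size,
)

# Quick connectivity check: every rank must reach this barrier,
# so a hang here usually means a firewall or routing problem.
dist.barrier()
print(f"rank {rank}/{world_size} joined the process group")
```

If the barrier hangs, the nodes can’t reach each other and the fix is in the network setup, not in PyTorch. And even once it works, expect cross-cloud latency and bandwidth to dominate gradient synchronization time, which is why this setup is rarely a good idea.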