Use all the GPUs for training

I want to use multiple GPUs for training. But since I have to run this training job on a cloud server, I don't know the GPU ids in advance, and every tutorial I have seen requires GPU ids to specify which devices to use.

Therefore, I want to know if there is a way for me to use all available devices.

You could use torch.cuda.device_count() to get the number of visible GPUs and then torch.cuda.get_device_properties(idx) to query the properties of the device at index idx, so you never need to hard-code ids.
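
A minimal sketch of what that looks like (device indices are always 0 through device_count() - 1, regardless of which physical GPUs the cloud provider assigns):

```python
import torch

# Enumerate every GPU visible to this process without hard-coding ids.
num_gpus = torch.cuda.device_count()
print(f"Found {num_gpus} GPU(s)")

for idx in range(num_gpus):
    props = torch.cuda.get_device_properties(idx)
    # props exposes fields such as name, total_memory, and multi_processor_count.
    print(f"cuda:{idx} -> {props.name}, {props.total_memory / 1024**3:.1f} GiB")
```

From there you can hand all visible devices to a multi-GPU wrapper, e.g. `torch.nn.DataParallel(model)` uses every visible GPU by default when no `device_ids` argument is given.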

Thank you! It works!