Hi all,
Is there a way to specify which GPUs on a node should be used, rather than just how many?
The documentation only shows how to specify the number of GPUs to use:

```
python -m torch.distributed.launch --nproc_per_node=NUM_GPUS_YOU_HAVE ...
```
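For reference, the only workaround I'm aware of is masking devices via the `CUDA_VISIBLE_DEVICES` environment variable before launching (the GPU ids and script name below are just placeholders), but I'd like to know whether the launcher itself supports selecting specific GPUs:

```shell
# Restrict the process to GPUs 1 and 3; PyTorch then renumbers the
# visible devices as cuda:0 and cuda:1.
export CUDA_VISIBLE_DEVICES=1,3
# --nproc_per_node would then match the number of *visible* GPUs, e.g.:
#   python -m torch.distributed.launch --nproc_per_node=2 train.py
echo "$CUDA_VISIBLE_DEVICES"
```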
This was already asked in this thread, but never answered.
Cheers!