Why are two GPUs used even though I set just one?

Here it is:

 the_device = torch.device("gpu:2")
 myModel.to(the_device)

Only GPU 2 is supposed to be used, but both GPU 0 and GPU 2 come into play. I am not sure what is wrong with my code.

Am I missing something?

Thanks in advance for your help

Maybe you should replace “gpu:2” with “cuda:2”?

Moreover, you can also change the GPU in use with torch.cuda.device() or torch.cuda.set_device().

You can refer to this to set up and run CUDA operations.
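For example, something like this (a minimal sketch; nn.Linear stands in for your model, and the index 2 assumes at least three visible GPUs):

    import torch
    import torch.nn as nn

    # "cuda:2" (not "gpu:2") is the valid device string for the third GPU
    the_device = torch.device("cuda:2")

    myModel = nn.Linear(10, 2)  # stand-in for your model
    myModel.to(the_device)      # moves all parameters and buffers to GPU 2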

Sorry, that was a typo; I did use “cuda:2” and it did not work well.

Did you use torch.cuda.device() or torch.cuda.set_device()? It works for me.

If you already used torch.cuda.device() but it did not work, check this first.
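Something like this is what I mean (a rough sketch that assumes at least three GPUs are visible):

    import torch

    if torch.cuda.device_count() > 2:
        # make GPU 2 the current device for all following CUDA allocations
        torch.cuda.set_device(2)
        x = torch.randn(4, 4, device="cuda")      # allocated on cuda:2

        # or restrict the change to a block with the context manager
        with torch.cuda.device(0):
            y = torch.randn(4, 4, device="cuda")  # allocated on cuda:0 inside this block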

You are right; torch.cuda.set_device() works.
BTW, what is the recommended way to set the GPU? From the examples given by PyTorch, it seems .to(device) is used.

Anyway, thank you very much.
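For reference, the device-agnostic pattern from the official examples looks roughly like this (a sketch with a stand-in model and hypothetical tensor shapes; the point is that the inputs have to be moved to the same device as the model):

    import torch
    import torch.nn as nn

    # pick the device once, then move both the model and the data to it
    device = torch.device("cuda:2" if torch.cuda.device_count() > 2 else "cpu")

    model = nn.Linear(10, 2).to(device)         # stand-in model
    inputs = torch.randn(8, 10, device=device)  # inputs must live on the same device
    outputs = model(inputs)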

This is a bug. If your PyTorch version is 0.4.1, please let me know your model definition and I can add it to this tracking issue: https://github.com/pytorch/pytorch/issues/10832