Torch detects 1 GPU inside Jupyter but multiple from the CLI?

I'm not sure what the issue could be, but when I call torch.cuda.device_count() inside a plain Python REPL it reports 4, while from inside a Jupyter notebook it only reports 1. I'm running this on an Amazon EC2 instance (p3.16xlarge).
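For reference, this is the minimal check I'm comparing in both places (assuming torch is importable in both the REPL and the notebook kernel):

```python
import torch

# In the plain Python REPL this prints 4 on my instance,
# but inside the Jupyter notebook it prints 1.
print(torch.cuda.device_count())
```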

Could you check if your Jupyter environment sets CUDA_VISIBLE_DEVICES somewhere? That would mask the devices and only make the specified ones visible.
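Something like this in a notebook cell should show whether the variable is set for the kernel process (just a quick sketch, nothing Jupyter-specific):

```python
import os

# None means the variable is not set for this process;
# a value like "0" would hide all GPUs except device 0.
print(os.environ.get("CUDA_VISIBLE_DEVICES"))
```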

@ptrblck I don’t see that variable using %env. Is there somewhere else I should check?

I don't know how Jupyter is launched and whether specific env vars are passed to it at launch.
However, checking the environment via export could work.
Also, I don't know whether Jupyter itself uses some arguments to mask specific GPUs.
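As a rough way to compare the two environments, you could dump the CUDA-related variables the notebook kernel actually sees and diff them against the output of export (or env) in the shell you launch Jupyter from:

```python
import os

# Run this inside a notebook cell, then compare against the
# shell's environment (e.g. `env | sort`) to spot differences.
for key, value in sorted(os.environ.items()):
    if "CUDA" in key or "NVIDIA" in key:
        print(f"{key}={value}")
```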