torch.cuda.is_available() does not detect the GPU

Hi,

I am unable to use the GPU on my GPU server despite creating a conda environment with pytorch-cuda. Could you please help me solve this issue?
When I run:
torch.device('cuda' if torch.cuda.is_available() else 'cpu'), I get 'cpu'
torch.__version__ is 2.1.1
torch.version.cuda is 12.1
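For completeness, this is a sketch of the full check I am running, with a couple of extra prints (device count and CUDA_VISIBLE_DEVICES) added as a guess at where the problem might be:

```python
import os
import torch

# What I installed / what PyTorch reports
print("torch version:", torch.__version__)            # 2.1.1
print("compiled CUDA version:", torch.version.cuda)   # 12.1
print("cuda available:", torch.cuda.is_available())   # False on this node
print("device count:", torch.cuda.device_count())

# Extra check (assumption): Slurm normally exports CUDA_VISIBLE_DEVICES for
# GPU jobs, so an empty or unset value here could explain the CPU fallback.
print("CUDA_VISIBLE_DEVICES:", os.environ.get("CUDA_VISIBLE_DEVICES"))

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print("selected device:", device)  # currently cpu
```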

I am using an NVIDIA A100 40GB GPU through Slurm.
When I run nvidia-smi, I get:
Driver Version: 535.104.12
CUDA Version: 12.2

If PyTorch does not detect your GPUs, you might need to check your drivers, as I can properly use the latest binaries on servers with A100s.
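As a rough triage, a sketch like the one below can help tell a CPU-only build apart from a driver or visibility problem (the printed hints are my assumptions about likely causes, not something verified on your setup):

```python
import torch

# Rough triage: CPU-only build vs. no GPU visible vs. working setup
if torch.version.cuda is None:
    # The installed wheel was built without CUDA; recreate the environment
    # and make sure the CUDA-enabled binaries are actually pulled in.
    print("CPU-only PyTorch build installed")
elif torch.cuda.device_count() == 0:
    # The build has CUDA support, but no device is visible to this process;
    # on Slurm this can happen if the job allocation did not include a GPU.
    print("CUDA build, but no GPU is visible to this process")
else:
    print("GPU detected:", torch.cuda.get_device_name(0))
```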