Need help with Tesla K40m GPU

Hi, I am working as an intern at a company and they have provided me a GPU for learning and training. So far I have installed tf==2.10 with CUDA toolkit 11.2.0 and cuDNN 8.1.0, and I was able to detect the GPU with this setup. Now I want to install torch so that it also detects the GPU. Which versions of torch and CUDA should I install? I am unable to detect the GPU with torch 2.0.1+cpu, and when I check torch.version.cuda the output is None. I am going to use the ultralytics and easyocr libraries for my work, please help. My environment is CLI only (cmd, then python), and I am using pip to download packages on another machine and then install them, as my server is offline and runs Windows Server 2016.
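As a side note, the "+cpu" local version suffix already tells you the wheel was built without CUDA, which is why torch.version.cuda is None. A pure-Python sketch of reading that suffix (the helper name is hypothetical, not a torch API):

```python
# Sketch: the local version suffix of a torch wheel indicates which backend it
# was built for ("+cpu" = no CUDA, "+cu113" = CUDA 11.3, and so on).

def cuda_from_version(version: str):
    """Return the CUDA version of a torch version string, or None for CPU-only."""
    if "+" not in version:
        return None  # no local suffix; backend unknown from the string alone
    suffix = version.split("+", 1)[1]
    if suffix.startswith("cu"):
        return suffix[2:4] + "." + suffix[4:]  # "cu113" -> "11.3"
    return None  # "+cpu": no CUDA support in this wheel

print(cuda_from_version("2.0.1+cpu"))     # None -> the GPU can never be detected
print(cuda_from_version("1.12.1+cu113"))  # 11.3
```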

Your Kepler GPU is too old and is not supported in the PyTorch binaries anymore, as the binaries with CUDA 11.8 support sm_37-sm_90 while the binaries with CUDA 12.1 support sm_50-sm_90.
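The mismatch can be sketched in pure Python (hypothetical helper, only to illustrate why sm_35 falls outside both ranges):

```python
# Sketch: check whether a GPU's compute capability falls inside the
# architecture range a given set of PyTorch binaries was compiled for.

def is_supported(capability, min_sm, max_sm):
    """capability is (major, minor), e.g. (3, 5) for a Tesla K40m."""
    sm = capability[0] * 10 + capability[1]  # (3, 5) -> sm_35
    return min_sm <= sm <= max_sm

# CUDA 11.8 binaries ship sm_37..sm_90; CUDA 12.1 binaries ship sm_50..sm_90.
print(is_supported((3, 5), 37, 90))  # Tesla K40m (sm_35) -> False
print(is_supported((3, 5), 50, 90))  # -> False
```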
You could try to rebuild PyTorch from source for compute capability 3.5.

Thank you for the reply. From where can I rebuild, and which version should I use?

Here are the build instructions.
You should be able to build PyTorch from main as long as the CUDA toolkit you use supports your architecture. Since your TF installation seems to work, I would recommend using CUDA 11.2 to match it.
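When building from source, the TORCH_CUDA_ARCH_LIST environment variable selects which architectures the build targets. A minimal sketch, assuming you build from a PyTorch source checkout on the target box:

```python
# Sketch: target the Tesla K40m (compute capability 3.5) when building PyTorch
# from source. TORCH_CUDA_ARCH_LIST must be set before the build starts.
import os

os.environ["TORCH_CUDA_ARCH_LIST"] = "3.5"  # Kepler, sm_35

# Then, from the pytorch source directory (run in the same environment):
#   python setup.py develop
print(os.environ["TORCH_CUDA_ARCH_LIST"])
```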

There is no PyTorch binary built with CUDA 11.2. If I install CUDA toolkit 11.3 with torch==1.12.1+cu113, will this work? As far as I can see, my drivers only support up to CUDA 11.2.

It’s not a question about the CUDA toolkit you use, but rather which PyTorch version still supports your GPU. As mentioned before, the PyTorch binaries dropped support for sm_35 a while ago. I don’t remember which release was the last one supporting it, but I think it was removed in torch==1.9.0.

Can you check this table and confirm whether it is authentic and would work?

No sorry, I didn’t create or maintain this table, so I don’t know how it was created.
To get the supported GPU architectures of the current builds you can use print(torch.cuda.get_arch_list()) and compare the output to the table (or ask its author).
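The comparison itself can be sketched without a GPU. The sample list below is an assumption modelled on typical CUDA 11.8 wheels; on a real install you would use torch.cuda.get_arch_list() and torch.cuda.get_device_capability() instead. Note this is a simplified exact-match check (real binaries can also JIT-compile PTX for newer architectures):

```python
# Sketch: compare a binary's arch list against a device's compute capability.
# The sample list is an assumption, not read from an actual torch build.

arch_list = ["sm_37", "sm_50", "sm_60", "sm_70", "sm_75", "sm_80", "sm_86", "sm_90"]

def device_is_supported(capability, archs):
    """capability is (major, minor), as returned by get_device_capability()."""
    sm = "sm_%d%d" % capability  # (3, 5) -> "sm_35"
    return sm in archs

print(device_is_supported((3, 5), arch_list))  # Tesla K40m -> False
print(device_is_supported((8, 6), arch_list))  # e.g. an Ampere GPU -> True
```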