My laptop runs Windows and has two GeForce GTX 880M GPUs. I updated the NVIDIA driver to version 425.31 with CUDA 10.1 and then installed PyTorch using conda. torch.cuda.is_available() returns True, but during training it reports that no kernel image can be found (see below). I tried a lot of versions for old GPUs, and project123, but it still does not work. Can someone please help me? Thanks very much.
RuntimeError: cuda runtime error (48) : no kernel image is available for execution on the device at c:\new-builder_3\win-wheel\pytorch\aten\src\thcunn\generic/SpatialDilatedMaxPooling.cu:152
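For reference, this error usually means the installed binaries were not compiled for your GPU's compute capability (the GTX 880M is a Kepler card, sm_30). On a real machine you can query the capability with `torch.cuda.get_device_capability(0)`; the helper below is just an illustrative sketch (not a PyTorch API), and the minimum capability of (3, 5) is taken from the discussion later in this thread:

```python
# Sketch: check whether a GPU's compute capability is covered by the
# prebuilt PyTorch binaries. The minimum (3, 5) is an assumption based
# on this thread; check the release notes for your exact version.

def binaries_support(capability, min_capability=(3, 5)):
    """Return True if the prebuilt binaries include kernels for this GPU."""
    # Tuple comparison handles (major, minor) correctly: (3, 0) < (3, 5).
    return capability >= min_capability

# On a real machine you would obtain the capability via:
#   import torch
#   capability = torch.cuda.get_device_capability(0)
# The GTX 880M is a Kepler part with compute capability 3.0 (sm_30):
print(binaries_support((3, 0)))  # False: too old for the prebuilt binaries
print(binaries_support((7, 5)))  # True: e.g. a Turing sm_75 card
```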
I think that GPU is too old.
You should try using earlier versions of PyTorch, or try to compile it yourself, although I doubt NVIDIA supports it. Pretty sure you can find the list of supported GPUs for the different versions.
As @JuanFMontesinos has already mentioned, your GPU is quite old and you're most likely going to have to install from source (rather than via conda or pip). The instructions for this are here; you'll also have to install CUDA manually, but that's all stated in the instructions.
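In case it helps, here is a rough outline of what a source build looks like. This is only a sketch: the exact steps (especially on native Windows) are in the PyTorch README, and restricting `TORCH_CUDA_ARCH_LIST` to your card's architecture is the part that matters for an old GPU:

```shell
# Sketch of a from-source build (Linux/WSL-style shell shown; the native
# Windows steps differ -- see the PyTorch repo's README for details).
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch

# Build kernels only for the GPU's own architecture.
# The GTX 880M is Kepler, compute capability 3.0 (sm_30).
export TORCH_CUDA_ARCH_LIST="3.0"

python setup.py install
```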
If anyone knows the answer, it'll be @ptrblck, but I fear you'll need to install from source rather than the quick install via conda or pip.
As @AlphaBetaGamma96 mentioned, if it is possible at all, it will work via a source setup.
But again, the GPU is old enough that I don't know if you can compile newer PyTorch versions with the CUDA versions required for it.
Again, if you install earlier versions of PyTorch, maybe you can.
Earlier means 0.4 or before?
You also have all the CUDA version docs here.
It's a matter of checking which versions support your GPU, I'd say.
A source build could work, but since the minimum compute capability was set to >=3.5 for a long time, I don't know if even an older CUDA toolkit would be able to build PyTorch. CUDA 11 dropped sm_30, so that would not work.
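To make the toolkit constraint concrete: whether a build can cover a GPU depends on which SM architectures the toolkit can still target. A tiny illustrative check, where the architecture sets are assumptions for illustration (not exhaustive; only the sm_30 entries reflect what's stated above):

```python
# Illustrative: which SM architectures a given CUDA toolkit can compile for.
# CUDA 10.x can still target sm_30 (Kepler); CUDA 11 dropped it, as noted
# above. These sets are illustrative and deliberately incomplete.
SUPPORTED_ARCHS = {
    "10.1": {"sm_30", "sm_35", "sm_50", "sm_60", "sm_70", "sm_75"},
    "11.0": {"sm_35", "sm_50", "sm_60", "sm_70", "sm_75", "sm_80"},
}

def can_target(toolkit: str, arch: str) -> bool:
    """Return True if the given toolkit version can compile for this arch."""
    return arch in SUPPORTED_ARCHS.get(toolkit, set())

print(can_target("10.1", "sm_30"))  # True: an older toolkit can still build for Kepler
print(can_target("11.0", "sm_30"))  # False: CUDA 11 dropped sm_30
```

So a source build for the GTX 880M would have to pair an older PyTorch checkout with a pre-11 toolkit, which is exactly the combination that may no longer compile.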
The answers by @AlphaBetaGamma96 and @JuanFMontesinos are correct: your GPU is too old and the binaries do not support it. You could try a source build with an older CUDA toolkit, but I would not guarantee it’ll work.