Pure Python pip install and external CUDA installation

Dear forum members,

On my new Windows 10 AMD Ryzen system I wanted a pure Python (non-Anaconda) PyTorch install, since I couldn’t get NumPy to work well with OpenBLAS under Anaconda.

So I created a fresh virtual environment, first installed NumPy, and then installed PyTorch by running

pip3 install torch===1.3.0 torchvision===0.4.1 -f https://download.pytorch.org/whl/torch_stable.html

The installation finished, but when importing torch I ran into the error

from torch._C import *
ImportError: DLL load failed: The specified module could not be found

I tried installing the CPU-only version of torch, which worked without issues.

I was finally able to fix the error in the GPU version of PyTorch by installing CUDA manually before installing PyTorch (again by running pip3 install torch===1.3.0 torchvision===0.4.1 -f https://download.pytorch.org/whl/torch_stable.html).

This confuses me, as it was my understanding that PyTorch ships its own CUDA and cuDNN, independent of the system’s CUDA installation.
So I would like to understand how installing CUDA could have fixed my problem.

I would be grateful for some insight into this behaviour, as I would like to understand how my PyTorch installation is configured.

Thank you,
Paul

Hi,

PyTorch’s pip wheels do not bundle CUDA; they do bundle cuDNN, though.
With conda, we list CUDA as a dependency because a conda package can ship CUDA. That’s not the case with pip, so you need to install a matching CUDA version on your system to be able to use CUDA with the pip-installed package.
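To check whether the wheel and the system toolkit actually match, one can compare the CUDA release that `nvcc` reports with `torch.version.cuda`. A minimal sketch (assuming `nvcc` is on PATH when the toolkit is installed; the parsing of its banner line is an assumption about nvcc’s usual output format):

```python
import shutil
import subprocess

def system_cuda_version():
    """Return the CUDA toolkit release reported by nvcc (e.g. "10.1"),
    or None if nvcc is not on PATH."""
    nvcc = shutil.which("nvcc")
    if nvcc is None:
        return None
    banner = subprocess.run([nvcc, "--version"],
                            capture_output=True, text=True).stdout
    # nvcc typically prints: "Cuda compilation tools, release 10.1, V10.1.243"
    words = banner.replace(",", " ").split()
    if "release" in words:
        return words[words.index("release") + 1]
    return None

print("system CUDA:", system_cuda_version())
```

Comparing this value with `torch.version.cuda` shows whether the pip wheel and the installed toolkit agree.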


Thank you! I am new to PyTorch.
Yesterday I bought a new PC and didn’t install CUDA manually, but when I installed
PyTorch with cuda==10.1 via pip, imported it, and tested it, it turned out CUDA was available! I don’t know why!
Running “nvcc -V” in CMD doesn’t work.
Then I installed tensorflow-gpu in another pip venv, but it doesn’t work either.
I have no idea whether I have CUDA at all!
How can I confirm that CUDA exists?
Thanks!
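To confirm what CUDA support the installed torch package has, a minimal check like the following works (a sketch; it uses only attributes present in recent torch releases and degrades gracefully if torch is not importable):

```python
def cuda_report():
    """Describe the CUDA support of the installed torch package,
    or return None if torch is not importable at all."""
    try:
        import torch
    except ImportError:
        return None
    return {
        "torch": torch.__version__,
        # None for CPU-only wheels; e.g. "10.1" for CUDA builds
        "built_with_cuda": torch.version.cuda,
        # True only if a usable NVIDIA driver and GPU are actually present
        "cuda_available": torch.cuda.is_available(),
    }

print(cuda_report())
```

Note that `nvcc` is part of the full CUDA toolkit, not the NVIDIA driver, so `nvcc -V` failing does not by itself mean CUDA cannot be used by torch.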

Double post from here with an answer.