Install PyTorch using system CUDA and cuDNN

I have an NVIDIA Docker container with its own system-wide CUDA and cuDNN. I'm trying to create several Python environments, each with its own torch install. I was wondering if it's possible to link these torch installs against the system CUDA and cuDNN so that I don't have to download and install CUDA and cuDNN separately in each of these environments.

Can't you just install those in a conda base environment?

Yes, you can build PyTorch from source, which will use your locally installed CUDA toolkit and cuDNN, in each new virtual environment. (The prebuilt pip and conda binaries ship with their own CUDA runtime, so they won't pick up the system toolkit.)
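A minimal sketch of such a source build, assuming the container exposes the toolkit at `/usr/local/cuda` and cuDNN in the standard Ubuntu locations (adjust the paths for your image; the environment variables are the ones PyTorch's build reads):

```shell
# Create and activate a fresh environment (path is an example)
python -m venv ~/envs/torch-env
. ~/envs/torch-env/bin/activate

# Fetch PyTorch sources with submodules
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch
pip install -r requirements.txt

# Point the build at the system CUDA toolkit and cuDNN
# (assumed locations -- verify them inside your container)
export CUDA_HOME=/usr/local/cuda
export CUDNN_INCLUDE_DIR=/usr/include
export CUDNN_LIB_DIR=/usr/lib/x86_64-linux-gnu
export USE_CUDA=1 USE_CUDNN=1

# Build and install into the active environment
python setup.py install
```

After the build finishes, you can confirm the linked versions with `python -c "import torch; print(torch.version.cuda, torch.backends.cudnn.version())"`. Repeating the build in each environment still compiles PyTorch per environment, but all of them share the single system CUDA/cuDNN install.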
