dsethz
January 10, 2023, 3:42pm
1
Hey,
Question: Is it feasible to install a CUDA-enabled build of torch (and torchvision) on a machine without a GPU and without CUDA installed (e.g. pip install torch==1.10.1+cu111)?
Context: I want to declare torch as a dependency in my packaging metadata. The project is a plug-in for a GUI-based application and is intended for users without command-line experience. Furthermore, torch must work regardless of OS, on both GPU and CPU machines. However, automatically detecting at installation time whether the machine has a GPU does not seem to be possible (I am currently using poetry).
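One heuristic that is sometimes used for this (a sketch only, and not a reliable check — it detects the NVIDIA driver tooling, not necessarily a usable GPU) is to look for `nvidia-smi` on the PATH at install or first-run time:

```python
import shutil

def has_nvidia_gpu() -> bool:
    """Rough heuristic: an nvidia-smi binary on PATH usually
    means an NVIDIA driver (and hence an NVIDIA GPU) is present."""
    return shutil.which("nvidia-smi") is not None

print(has_nvidia_gpu())
```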
Similar issues have been discussed here:
python-poetry/poetry — opened 08:59AM, 05 Sep 22 UTC (labels: area/docs, status/triage)
## Issue
As mentioned in issue https://github.com/python-poetry/poetry/issues/4231 there is some confusion around installing PyTorch with CUDA but it is now somewhat resolved. It still requires a few steps, and all options have pretty serious flaws. Below are two options that worked for me, on Poetry version `1.2.0`.
# Option 1 - wheel URLs for a specific platform
* You will need to pick the specific wheels you want. These are listed here: https://download.pytorch.org/whl/torch_stable.html. E.g. if you want CUDA 11.6, Python 3.10 and Windows, search that page for `cu116-cp310-cp310-win_amd64.whl` to see the matches for `torch`, `torchaudio` and `torchvision`
* In your `pyproject.toml` file add the URLs like:
```toml
[tool.poetry.dependencies]
python = "^3.10"
numpy = "^1.23.2"
torch = { url = "https://download.pytorch.org/whl/cu116/torch-1.12.1%2Bcu116-cp310-cp310-win_amd64.whl"}
torchaudio = { url = "https://download.pytorch.org/whl/cu116/torchaudio-0.12.1%2Bcu116-cp310-cp310-win_amd64.whl"}
torchvision = { url = "https://download.pytorch.org/whl/cu116/torchvision-0.13.1%2Bcu116-cp310-cp310-win_amd64.whl"}
```
* Run `poetry update`. It will download a _lot_ of data (many GB) and take quite some time, and the download doesn't seem to cache reliably (I've waited 30+ minutes at 56 Mbps three separate times for the exact same wheels while troubleshooting this)
Note that each subsequent `poetry update` will do _another_ huge download and you'll see this message:
```
• Updating torch (1.12.1+cu116 -> 1.12.1+cu116 https://download.pytorch.org/whl/cu116/torch-1.12.1%2Bcu116-cp310-cp310-win_amd64.whl)
• Updating torchaudio (0.12.1+cu116 -> 0.12.1+cu116 https://download.pytorch.org/whl/cu116/torchaudio-0.12.1%2Bcu116-cp310-cp310-win_amd64.whl)
• Updating torchvision (0.13.1+cu116 -> 0.13.1+cu116 https://download.pytorch.org/whl/cu116/torchvision-0.13.1%2Bcu116-cp310-cp310-win_amd64.whl)
```
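The wheel URLs above follow a regular pattern, so if you need several of them you can assemble them programmatically rather than copy each one from the index page. This is a hypothetical helper based only on the naming scheme visible in the URLs above (the `%2B` is a URL-encoded `+`); it does not check that the wheel actually exists:

```python
def torch_wheel_url(pkg: str, version: str, cuda: str,
                    pytag: str, platform: str) -> str:
    # e.g. pkg="torch", version="1.12.1", cuda="cu116",
    #      pytag="cp310" (CPython 3.10), platform="win_amd64"
    return (
        f"https://download.pytorch.org/whl/{cuda}/"
        f"{pkg}-{version}%2B{cuda}-{pytag}-{pytag}-{platform}.whl"
    )

print(torch_wheel_url("torch", "1.12.1", "cu116", "cp310", "win_amd64"))
```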
# Option 2 - alternate source
```toml
[tool.poetry.dependencies]
python = "^3.10"
numpy = "^1.23.2"
torch = { version = "1.12.1", source="torch"}
torchaudio = { version = "0.12.1", source="torch"}
torchvision = { version = "0.13.1", source="torch"}
[[tool.poetry.source]]
name = "torch"
url = "https://download.pytorch.org/whl/cu116"
secondary = true
```
This seems to have worked (although I already had the packages installed), but it reports errors like `Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/pillow/`. The packages appear to get installed anyway (a better message might be "Can't access pillow at 'https://download.pytorch.org/whl/cu116', falling back to PyPI").
Also, if you later go on to do, say `poetry add pandas` (a completely unrelated library) you'll get a wall of messages like:
```
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/pandas/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/pandas/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/pytz/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/python-dateutil/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/numpy/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/pillow/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/requests/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/typing-extensions/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/certifi/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/urllib3/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/idna/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/charset-normalizer/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/python-dateutil/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/six/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/pytz/
Source (torch): Authorization error accessing https://download.pytorch.org/whl/cu116/six/
```
This happens with or without `secondary = true` in the source config.
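For what it's worth, newer Poetry releases (1.5 and later) replaced `secondary = true` with a `priority` field; marking the source `explicit` tells Poetry to consult it only for packages that name it, which should avoid this wall of messages for unrelated dependencies (a sketch of the replacement config, assuming Poetry 1.5+):

```toml
[[tool.poetry.source]]
name = "torch"
url = "https://download.pytorch.org/whl/cu116"
priority = "explicit"
```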
Maintainers: please feel free to edit the text of this if I've got something wrong.
Best,
dsethz
ptrblck
January 10, 2023, 10:13pm
2
`pip install torch torchvision` will install the default wheels hosted on PyPI, which ship with the CUDA 11.7 runtime, so it seems this would already solve your issue.
dsethz
January 11, 2023, 10:49am
3
OK, so do I understand you correctly that installing a torch version that ships with e.g. the CUDA 11.7 runtime does not introduce unintended behaviour on a machine that does not have a GPU available?
Yes, I don’t think installing the default pip wheels with the CUDA 11.7 runtime has any side effects besides the size increase (since CUDA libs will be downloaded and stored).
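A common pattern to keep the same code working on both kinds of machines is to pick the device at runtime (a minimal sketch; it falls back to CPU whenever CUDA is unavailable):

```python
try:
    import torch
    # torch.cuda.is_available() returns False on CPU-only machines,
    # even when a CUDA-enabled wheel is installed.
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
except ImportError:
    device = "cpu"  # torch not installed at all

print(device)
```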