Dependency for torchvision and torchtext

I was downloading the latest versions of PyTorch to install offline on a CPU-only Windows VDI. I ran the following:

pip3 download -d torch       torch==1.4.0+cpu       -f https://download.pytorch.org/whl/torch_stable.html
pip3 download -d torchvision torchvision==0.5.0+cpu -f https://download.pytorch.org/whl/torch_stable.html
pip3 download -d torchtext   torchtext==0.5.0       -f https://download.pytorch.org/whl/torch_stable.html

The files downloaded are

..\torch
        77,395,110 torch-1.4.0+cpu-cp37-cp37m-win_amd64.whl

..\torchtext
           156,030 certifi-2019.11.28-py2.py3-none-any.whl
           133,356 chardet-3.0.4-py2.py3-none-any.whl
            58,594 idna-2.8-py2.py3-none-any.whl
        12,766,851 numpy-1.18.1-cp37-cp37m-win_amd64.whl
            57,952 requests-2.22.0-py2.py3-none-any.whl
         1,170,436 sentencepiece-0.1.85-cp37-cp37m-win_amd64.whl
            10,938 six-1.14.0-py2.py3-none-any.whl
       641,841,440 torch-1.4.0+cu92-cp37-cp37m-win_amd64.whl
            73,165 torchtext-0.5.0-py3-none-any.whl
            56,720 tqdm-4.41.1-py2.py3-none-any.whl
           125,624 urllib3-1.25.7-py2.py3-none-any.whl

..\torchvision
        12,766,851 numpy-1.18.1-cp37-cp37m-win_amd64.whl
         2,033,167 Pillow-7.0.0-cp37-cp37m-win_amd64.whl
            10,938 six-1.14.0-py2.py3-none-any.whl
       641,841,440 torch-1.4.0+cu92-cp37-cp37m-win_amd64.whl
           485,819 torchvision-0.5.0+cpu-cp37-cp37m-win_amd64.whl

Is the CUDA version of torch (torch-1.4.0+cu92-cp37-cp37m-win_amd64.whl) required for torchvision and torchtext to run them on a CPU?

Hi,

Thanks for reporting this.
That binary should still work, right? It is just bigger than the CPU-only one.

It does. Size is a bit of an issue because I have to move these binaries after downloading.

OK, we’re looking into it.
Hopefully this won’t be too much of an inconvenience in the meantime.


Well, the issues for torchvision and torchtext seem different. A CPU-only package of torchvision should rely on a CPU-only package of pytorch, so that’s really a bug, which we are fixing right now. torchtext, however, is a noarch package, so we cannot pin its requirement to whatever variant the user desires unless it adopts the same version specifiers as pytorch and torchvision.
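You can see this for yourself by inspecting the metadata inside the torchtext wheel (a wheel is just a zip archive): it requires plain `torch` with no `+cpu` local version, so pip resolves the highest-versioned torch wheel on the index, and under PEP 440 local-version ordering `1.4.0+cu92` sorts above `1.4.0+cpu`. A minimal sketch, assuming `unzip` is available and the wheel is in the current directory:

```shell
# Print the dependency declarations from the wheel's METADATA file.
# The wheel filename matches the one downloaded above.
unzip -p torchtext-0.5.0-py3-none-any.whl 'torchtext-0.5.0.dist-info/METADATA' \
  | grep '^Requires-Dist'
```

The output should show an unpinned `Requires-Dist: torch` line, which is why the CUDA build was pulled in.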


Is there any way for me to force that (using a flag or some other command) during download? I am using pip3 download torch==1.4.0+cpu torchtext==0.5.0 -f https://download.pytorch.org/whl/torch_stable.html and I need to run it on a CPU-only Windows VDI.

The issue with torchvision / torch is fixed now.

This worked perfectly for me; it downloaded torch 1.4.0+cpu, torchvision 0.5.0+cpu, and torchtext 0.5.0:

pip download torch==1.4.0+cpu torchvision==0.5.0+cpu torchtext==0.5.0 -f https://download.pytorch.org/whl/torch_stable.html

Files:

certifi-2019.11.28-py2.py3-none-any.whl
chardet-3.0.4-py2.py3-none-any.whl
idna-2.8-py2.py3-none-any.whl
numpy-1.18.1-cp37-cp37m-manylinux1_x86_64.whl
Pillow-7.0.0-cp37-cp37m-manylinux1_x86_64.whl
requests-2.22.0-py2.py3-none-any.whl
sentencepiece-0.1.85-cp37-cp37m-manylinux1_x86_64.whl
six-1.14.0-py2.py3-none-any.whl
torch-1.4.0+cpu-cp37-cp37m-linux_x86_64.whl
torchtext-0.5.0-py3-none-any.whl
torchvision-0.5.0+cpu-cp37-cp37m-linux_x86_64.whl
tqdm-4.41.1-py2.py3-none-any.whl
urllib3-1.25.7-py2.py3-none-any.whl
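Once the wheels have been copied to the offline machine, they can be installed without any network access. A minimal sketch, assuming the `.whl` files were copied into a directory named `wheels` (that name is a placeholder):

```shell
# --no-index prevents pip from contacting PyPI; --find-links points it
# at the local directory containing the downloaded .whl files.
pip3 install --no-index --find-links=wheels \
    torch==1.4.0+cpu torchvision==0.5.0+cpu torchtext==0.5.0
```

Pinning the exact versions here guarantees pip installs the same CPU-only builds that were downloaded.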