No CUDA 12.4 distribution for Linux when using pip?

Hi, thanks for your work!

I’m wondering whether I’m looking at a typo or whether pip really has no CUDA 12.4 distribution for Linux.

Currently, on the Start Locally | PyTorch page, the pip command for Linux with CUDA 12.4 has no --index-url option.

Up to CUDA 12.1, the Linux pip command also includes --index-url, but the flag suddenly disappears for CUDA 12.4. Is this an accidental omission in the documentation, or is there really no dedicated CUDA 12.4 pip distribution for Linux?

I searched around but couldn’t find a definitive answer.

Thanks for your help!


I’ve read elsewhere that PyTorch now (2.5.1, and probably some earlier versions too) installs the CUDA runtime by default on Linux from the regular PyPI index, so no extra index is needed. I can’t see the metadata myself; you may have to look at what it actually installs.


Thanks. I’d check this myself by running

import torch
print(torch.cuda.is_available())

after installing torch, but I’m not on Linux right now.
If anyone could confirm this, I’d appreciate it!
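
For whoever does check on Linux: torch.version.cuda is probably also worth printing, since it reports the CUDA version the wheel was built against even on a machine without a GPU, whereas torch.cuda.is_available() additionally needs a working driver. A rough sketch (untested on my side, since I’m not on Linux):

import torch

# Version string; wheels from the download.pytorch.org index usually carry a
# local suffix like 2.5.1+cu124, while the plain PyPI wheel may show just 2.5.1.
print("torch version:", torch.__version__)

# CUDA version the wheel was built against, e.g. "12.4"; None means a CPU-only build.
print("built with CUDA:", torch.version.cuda)

# True only if the build has CUDA support *and* a GPU with a working driver is visible.
print("CUDA available:", torch.cuda.is_available())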

You can also run something like pip freeze | grep cuda and check whether the CUDA-enabled torch and its CUDA runtime packages are there.

The folks at astral/uv also discuss the GPU installs in their docs: PyTorch | uv

Note that, conversely, the CPU-only install now has a separate pip index for Linux as well.

The default pip install torch wheels have always shipped with CUDA runtime dependencies, so just copy/paste the command and you’ll see that the CUDA 12.4 dependencies are installed too.
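
If you’d rather check from Python than with pip freeze, a rough sketch along these lines should work too (assuming the Linux wheel pulls its CUDA runtime in via the usual nvidia-* packages from PyPI):

from importlib import metadata

# List the NVIDIA runtime packages (e.g. nvidia-cuda-runtime-cu12) that pip
# pulled in alongside torch.
nvidia_pkgs = sorted(
    dist.metadata["Name"]
    for dist in metadata.distributions()
    if (dist.metadata["Name"] or "").startswith("nvidia-")
)
print("nvidia runtime packages:", nvidia_pkgs or "none found")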
