CUPTI library used by PyTorch

Hey folks,

My query is simple: is there any way to deduce, from a piece of Python code, the exact CUPTI library that PyTorch uses? Most of the time, torch uses the libcupti that comes with the CUDA toolkit installation. However, more recent versions of torch ship a dedicated CUPTI library as part of the torch installation itself.

I can deduce this with tools like LD_DEBUG in an interactive environment, but I wanted to know if there's a way to get this information automatically from torch (even a hacky way would do).

PS: I later intend to use this same library to add more tracing functionality and collect some additional metrics.
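One hacky-but-automatic approach on Linux: libcupti is only loaded once the profiler actually runs, so you can kick off a short profiling session and then scan `/proc/self/maps` for the mapped shared object. This is a sketch, not a torch API; the `loaded_libs` helper and the profiling snippet in the comment are my own illustration:

```python
import re

def loaded_libs(pattern: str) -> set[str]:
    """Paths of shared objects currently mapped into this process
    whose path matches `pattern` (Linux-only: parses /proc/self/maps)."""
    rx = re.compile(pattern)
    found = set()
    with open("/proc/self/maps") as f:
        for line in f:
            # Fields: address perms offset dev inode [pathname]
            fields = line.split(maxsplit=5)
            if len(fields) == 6 and rx.search(fields[5]):
                found.add(fields[5].strip())
    return found

# With torch installed, run one profiled CUDA op so CUPTI gets loaded,
# then look for it (illustrative usage, not run here):
#   import torch
#   from torch.profiler import profile, ProfilerActivity
#   with profile(activities=[ProfilerActivity.CUDA]):
#       torch.randn(8, device="cuda").sum()
#   print(loaded_libs(r"libcupti"))
```

The path it prints tells you whether libcupti came from the torch/pip site-packages tree or from a system CUDA toolkit install.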

Can you derive it from the information in torch.__config__.show()? It includes the CUDA toolkit version the binary was compiled with.
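For example, you could pull the toolkit version out of that string with a small regex. Note the "CUDA Runtime x.y" line format is an assumption about torch's output here, so the parser returns None if it isn't found:

```python
import re

def built_cuda_version(config_text: str):
    """Extract the CUDA toolkit version a torch binary was built against
    from torch.__config__.show() output (line format assumed)."""
    m = re.search(r"CUDA Runtime (\d+\.\d+)", config_text)
    return m.group(1) if m else None

# Illustrative sample of the assumed format; with torch installed you
# would instead pass torch.__config__.show():
sample = "PyTorch built with:\n  - CUDA Runtime 12.1\n"
print(built_cuda_version(sample))  # 12.1
```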

pip list should also show the installed dependencies, including their full version tags:

pip list | grep nvidia
nvidia-cublas-cu12            12.1.3.1
nvidia-cuda-cupti-cu12        12.1.105
nvidia-cuda-nvrtc-cu12        12.1.105
nvidia-cuda-runtime-cu12      12.1.105
nvidia-cudnn-cu12             8.9.2.26
nvidia-cufft-cu12             11.0.2.54
nvidia-curand-cu12            10.3.2.106
nvidia-cusolver-cu12          11.4.5.107
nvidia-cusparse-cu12          12.1.0.106
nvidia-nccl-cu12              2.19.3
nvidia-nvjitlink-cu12         12.1.105
nvidia-nvtx-cu12              12.1.105

E.g. this is the output of the current nightly binary.
