My query is simple. Is there any way to deduce the exact CUPTI library that PyTorch uses from within Python code? Most of the time, torch uses the base version of the CUPTI library that comes with the CUDA toolkit installation. However, more recent versions of torch have begun to ship a dedicated CUPTI library as part of the torch installation.
I can use tools like LD_DEBUG to deduce this in an interactive environment. Is there a way to get this information automatically from torch (even a hacky way would do)?
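For context, this is roughly the kind of hack I have in mind (a minimal sketch, assuming Linux with /proc/self/maps and an available CUDA device; the profiler warm-up step and the regex are my own guesses, not an official torch mechanism):

```python
import re
import torch
from torch.profiler import profile, ProfilerActivity

def find_loaded_cupti():
    # Run a tiny profiled workload so the CUPTI shared object gets dlopen'd
    # by torch's profiler (assumes a CUDA device is present).
    with profile(activities=[ProfilerActivity.CUDA]):
        torch.cuda.synchronize()

    # Any mapped file whose path contains "libcupti" shows up in the
    # process's memory maps once it has been loaded.
    paths = set()
    with open("/proc/self/maps") as f:
        for line in f:
            m = re.search(r"(/\S*libcupti\S*)", line)
            if m:
                paths.add(m.group(1))
    return sorted(paths)

if __name__ == "__main__":
    print(find_loaded_cupti())
```

This works for me interactively, but I'd prefer something torch exposes directly rather than scraping the process's memory maps.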
PS: I later intend to use this same library to add more tracing functionality and get some additional metrics.