That’s what it does.
ChatGPT suggested running:
"import torch
print(torch.__version__)
print(torch.cuda.is_available())
print(torch.version.cuda)
print(torch.cuda.nccl.version())" before trying the llama chatbot.
The only one that didn't work was "print(torch.cuda.nccl.version())", which failed with:
"Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Python310\lib\site-packages\torch\cuda\nccl.py", line 35, in version
    ver = torch._C._nccl_version()
AttributeError: module 'torch._C' has no attribute '_nccl_version'".
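That AttributeError is expected on Windows: NCCL only ships with the Linux builds of PyTorch, so `torch._C._nccl_version` simply doesn't exist in the Windows wheels. A minimal sketch of a platform-guarded version of the diagnostic script (the `nccl_expected` helper is my own; the torch calls are left commented so the sketch stands on its own even without torch installed):

```python
import platform

def nccl_expected(system: str) -> bool:
    # Hypothetical helper: NCCL ships only with Linux builds of PyTorch;
    # Windows (and macOS) wheels are built without it, which is why
    # torch._C._nccl_version is missing here.
    return system == "Linux"

# Guarded version of the diagnostic script (run where torch is installed):
# import torch
# print(torch.__version__)
# print(torch.cuda.is_available())
# print(torch.version.cuda)
# if nccl_expected(platform.system()):
#     print(torch.cuda.nccl.version())
# else:
#     print("skipping NCCL check on", platform.system())
```

On your machine `platform.system()` returns "Windows", so the NCCL check is skipped rather than crashing the script.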
P.S. In case it helps:
"nvcc --version" reports "Build cuda_12.4.r12.4/compiler.33961263_0", while
"torch.version.cuda" reports "12.1".