Torch::cuda::is_available() false (again) - WIN 10

Ok, I rebuilt PyTorch from master today, and the build completed successfully with CUDA and cuDNN support - Windows 10, CUDA 9.2. But my C++ app now reports that torch::cuda::is_available() is false. There was a similar issue in the past - see here. torch.dll and torch.lib now seem to be just placeholders, so I have no idea how to link my app correctly so that torch::cuda::is_available() returns true again in a C++ app.
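
For reference, here is a minimal repro of the check my app performs - just a sketch using torch::cuda::is_available(), torch::cuda::device_count(), and torch::cuda::cudnn_is_available() from the LibTorch C++ API, nothing app-specific:

```cpp
#include <iostream>
#include <torch/torch.h>

int main() {
  // Report what the LibTorch runtime actually sees at startup.
  std::cout << std::boolalpha;
  std::cout << "CUDA available:    " << torch::cuda::is_available() << '\n';
  std::cout << "CUDA device count: " << torch::cuda::device_count() << '\n';
  std::cout << "cuDNN available:   " << torch::cuda::cudnn_is_available() << '\n';
  return 0;
}
```

On this build it prints `CUDA available: false`, even though the same machine and driver worked with the previous version.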

Does anybody have an idea what’s wrong here?

Thanks.

Alex

As a follow-up, I found out that linking against all the libs (c10.lib, c10_cuda.lib, caffe2.lib and caffe2_gpu.lib) makes the resulting binary depend only on c10.dll and caffe2.dll. That’s why torch::cuda::is_available() is false. In previous versions I was linking against torch.lib, which made the resulting binary depend on torch.dll, which in turn depended on both caffe2.dll and caffe2_gpu.dll.
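
To illustrate what I mean (a rough sketch, not my actual project files - the #pragma comment lines just mirror what my Additional Dependencies setting contains):

```cpp
// Link the LibTorch import libraries (equivalent to the linker's
// Additional Dependencies setting in the real project).
#pragma comment(lib, "c10.lib")
#pragma comment(lib, "c10_cuda.lib")
#pragma comment(lib, "caffe2.lib")
#pragma comment(lib, "caffe2_gpu.lib")

#include <torch/torch.h>

int main() {
  // Even with caffe2_gpu.lib on the link line, dumpbin /DEPENDENTS on the
  // resulting .exe lists only c10.dll and caffe2.dll: an import library only
  // adds a DLL dependency if at least one of its symbols is actually
  // referenced, and nothing here references a symbol from caffe2_gpu.dll.
  return torch::cuda::is_available() ? 0 : 1;
}
```

So the CUDA backend never gets loaded, and torch::cuda::is_available() returns false. With the old torch.lib/torch.dll this wasn’t a problem, because the dependency on caffe2_gpu.dll came in transitively through torch.dll.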

This is a blocking issue for all Windows C++ users (am I the only one experiencing this?). I do not even understand why torch.lib and torch.dll are just placeholders now…

Thanks.

Alex

Ok, this is a real issue. It has to do with this and is also tracked here.