How to use the torchvision C++ API?

Hi devs,

I'm trying to use the torchvision NMS C++ module in my project. But following the documentation on GitHub, I keep failing to build torchvision.

My CMake version is 3.10.2, and I'm using the torchvision release/0.8.0 branch.
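
For reference, this is roughly how I configure the build (the libtorch path is the one that shows up in the log below; the exact steps may differ slightly from what I actually ran):

```sh
# clone torchvision and check out the release branch
git clone https://github.com/pytorch/vision.git
cd vision
git checkout release/0.8.0

# configure against my libtorch install and build
mkdir build && cd build
cmake -DCMAKE_PREFIX_PATH=/usr/local/lib/libtorch ..
make
```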

With that environment, I keep getting the error below:

  By not providing "FindPython3.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "Python3", but
  CMake did not find one.

  Could not find a package configuration file provided by "Python3" with any
  of the following names:

    Python3Config.cmake
    python3-config.cmake

  Add the installation prefix of "Python3" to CMAKE_PREFIX_PATH or set
  "Python3_DIR" to a directory containing one of the above files.  If
  "Python3" provides a separate development package or SDK, be sure it has
  been installed.
CUDA_TOOLKIT_ROOT_DIR not found or specified
-- Could NOT find CUDA (missing: CUDA_TOOLKIT_ROOT_DIR CUDA_NVCC_EXECUTABLE CUDA_INCLUDE_DIRS CUDA_CUDART_LIBRARY) 
CMake Warning at /usr/local/lib/libtorch/share/cmake/Caffe2/public/cuda.cmake:31 (message):
  Caffe2: CUDA cannot be found.  Depending on whether you are building Caffe2
  or a Caffe2 dependent library, the next warning / error will give you more
  info.
Call Stack (most recent call first):
  /usr/local/lib/libtorch/share/cmake/Caffe2/Caffe2Config.cmake:88 (include)
  /usr/local/lib/libtorch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
  CMakeLists.txt:17 (find_package)


CMake Error at /usr/local/lib/libtorch/share/cmake/Caffe2/Caffe2Config.cmake:90 (message):
  Your installed Caffe2 version uses CUDA but I cannot find the CUDA
  libraries.  Please set the proper CUDA prefixes and / or install CUDA.
Call Stack (most recent call first):
  /usr/local/lib/libtorch/share/cmake/Torch/TorchConfig.cmake:68 (find_package)
  CMakeLists.txt:17 (find_package)

So I tried downgrading torchvision to v0.4.0, but the CUDA error keeps coming up. (I also wonder whether my CMake 3.10.2 is simply too old for the find_package(Python3) call, since it doesn't seem to ship a FindPython3 module, but I'm not sure.)
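
Since my libtorch build apparently uses CUDA, I assume CMake also needs to be able to find the CUDA toolkit. Is passing the toolkit location explicitly the right approach, e.g. something like the following? (The CUDA path here is just a placeholder; I haven't verified that this works.)

```sh
# hypothetical: point CMake at both libtorch and the CUDA toolkit
cmake -DCMAKE_PREFIX_PATH=/usr/local/lib/libtorch \
      -DCUDA_TOOLKIT_ROOT_DIR=/usr/local/cuda ..
```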

If you have any advice about using the torchvision C++ API, please let me know.

Thanks for reading.

Regards.