Hi:
Does the current version of LibTorch support TensorRT as an inference backend?
You can use TensorRT as an inference backend for LibTorch by installing Torch-TensorRT: GitHub - NVIDIA/Torch-TensorRT: PyTorch/TorchScript compiler for NVIDIA GPUs using TensorRT.
Instructions on how to use the backend integration can be found here: Using Torch-TensorRT Directly From PyTorch — Torch-TensorRT master documentation.
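Roughly, the workflow is: compile the model with Torch-TensorRT's Python API into a TorchScript module backed by a TensorRT engine, save it, and then load it from LibTorch in C++ with `torch::jit::load`. A minimal sketch, assuming `torch_tensorrt` is installed, a CUDA GPU is available, and using an illustrative toy model and input shape:

```python
import torch
import torch_tensorrt

# Toy model for illustration; any scriptable/traceable nn.Module works.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
).eval().cuda()

# Compile to a TorchScript module whose forward runs a TensorRT engine.
trt_module = torch_tensorrt.compile(
    model,
    inputs=[torch_tensorrt.Input((1, 3, 224, 224))],
    enabled_precisions={torch.float},
)

# Run inference as with any TorchScript module.
x = torch.randn(1, 3, 224, 224, device="cuda")
out = trt_module(x)

# Save the compiled module; LibTorch (C++) can load it with
# torch::jit::load, since it is an ordinary TorchScript artifact.
torch.jit.save(trt_module, "trt_model.ts")
```

The exact compile options (precisions, dynamic shapes, workspace size) are covered in the Torch-TensorRT documentation linked above.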