Hello, I am interested in running model inference in C++. The model is trained in Python, and I have a few questions regarding libtorch vs. custom C++ extensions of PyTorch:
1. What is the main difference between libtorch and custom C++ extensions of PyTorch? Can both be used for inference?
2. Which one is better in terms of both ease of use and inference performance?
3. Can any PyTorch model be converted for inference in C++ using libtorch and/or C++ extensions?
4. Finally, which method is recommended going forward?
P.S.: By custom C++ extensions, I mean this
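For context, my current understanding of the libtorch route is that the model is first exported to TorchScript in Python and then loaded on the C++ side with `torch::jit::load`. A minimal sketch of the export step I have in mind (the `TinyNet` module here is just a stand-in for my real model):

```python
# Sketch of the TorchScript export step (assumption: this is the intended
# libtorch workflow; TinyNet is a toy stand-in for the actual trained model).
import torch

class TinyNet(torch.nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(x)

model = TinyNet().eval()

# torch.jit.script compiles the module to TorchScript;
# torch.jit.trace(model, example_input) is an alternative for simple models.
scripted = torch.jit.script(model)

# The saved .pt archive is what C++ would load via
# torch::jit::load("tiny_net.pt") after including <torch/script.h>.
scripted.save("tiny_net.pt")
```

Is this the workflow you would recommend, or does the C++-extension route change this picture?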