After 2.4, torch is changing the way custom operators are created in C++. Previously, with TorchScript, I could generate a custom-op library and export it to a Triton server as described in Custom Operations — NVIDIA Triton Inference Server.
How would I do that with the new approach? I don't see that part covered in the manual.
Thanks in advance