TorchScript vs torch2trt for using models in C++

Hi,

I would like to serialize my PyTorch model to use it with my C++ application in production. I came across TorchScript (as documented by PyTorch) and torch2trt.

Is there a difference in the serialization achieved by the two?

Currently,

  • TorchScript is more general - as in it supports a wider range of operations,
  • for the things TensorRT does support, you probably will not beat its performance (a minimal sketch of both export paths is below).
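To make the comparison concrete, here is a rough sketch of both export paths. The model and input shape are just stand-ins for yours; the torch2trt call follows the usage shown in its README, and the TorchScript file is what your C++ application would load with `torch::jit::load`.

```python
import torch
import torchvision
from torch2trt import torch2trt

# Stand-in model; replace with your own. Both paths expect eval mode,
# and torch2trt additionally expects the model and inputs on the GPU.
model = torchvision.models.resnet18().eval().cuda()
x = torch.randn(1, 3, 224, 224).cuda()

# TorchScript: trace (or script) the model and serialize it to a file
# that the C++ side can load via torch::jit::load("model_ts.pt").
model_ts = torch.jit.trace(model, x)
model_ts.save("model_ts.pt")

# torch2trt: build a TensorRT engine wrapped in an nn.Module and save
# its state dict; it can later be restored into a torch2trt TRTModule.
model_trt = torch2trt(model, [x])
torch.save(model_trt.state_dict(), "model_trt.pth")
```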

Best regards

Thomas

@tom Thanks for your reply. Could you elaborate on the generality of TorchScript? And if TensorRT is better at the things it supports, would it be a good idea to have a mixture of the two working together?

Regards,
Arpit

Well, TorchScript can represent arbitrary PyTorch operations, including custom operators.
TensorRT does not support all of them.
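As a toy illustration (in the spirit of the examples in the TorchScript tutorial): data-dependent control flow like the branch below is preserved directly in the TorchScript IR, whereas a tracing-based conversion such as torch2trt would bake in only the branch taken for the example input.

```python
import torch

class DecisionGate(torch.nn.Module):
    # The branch depends on the runtime value of x, not just its shape.
    # torch.jit.script keeps this if/else in the compiled graph; a static
    # TensorRT engine built from a trace would not.
    def forward(self, x):
        if x.sum() > 0:
            return x
        return -x

scripted = torch.jit.script(DecisionGate())
print(scripted.code)  # the if/else is visible in the scripted code
```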

Whether mixing the two is worthwhile depends on your requirements. It is a bit more effort, but torch2trt would seem to lend itself to that, since the converted result is still a PyTorch module that the rest of your model can call.
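A rough sketch of what such a mixture could look like (the module, shapes, and postprocessing here are made up for illustration; only the torch2trt call itself follows its documented usage): convert the TensorRT-friendly part and keep the rest in plain PyTorch.

```python
import torch
from torch2trt import torch2trt

# TensorRT-friendly part: plain convolutions, converted by torch2trt.
backbone = torch.nn.Sequential(
    torch.nn.Conv2d(3, 16, kernel_size=3, padding=1),
    torch.nn.ReLU(),
).eval().cuda()

x = torch.randn(1, 3, 224, 224).cuda()
backbone_trt = torch2trt(backbone, [x])

class Hybrid(torch.nn.Module):
    """Runs the TensorRT engine for the backbone and keeps the
    remaining (possibly unsupported) logic in ordinary PyTorch."""
    def __init__(self, backbone_trt):
        super().__init__()
        self.backbone_trt = backbone_trt

    def forward(self, x):
        feats = self.backbone_trt(x)
        # Stand-in for postprocessing that TensorRT might not cover.
        return feats.mean(dim=(2, 3))

out = Hybrid(backbone_trt)(x)
```

Note that this hybrid runs from Python in eager mode; getting the TensorRT part into your C++ application is a separate question from serializing the TorchScript part.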