How to compile libtorch for mobile?

I’m building a cross-platform (Windows, Linux, macOS, Android, iOS) C++ library that runs a couple of ML models (inference only), currently using TensorFlow Lite. With mobile support in the latest PyTorch release, I’m considering switching to it. But I’m not sure I understand things correctly, so here are a few questions.

  1. I want to link libtorch statically into my library. Is libtorch mobile a minimalistic version of libtorch (kind of like TensorFlow Lite vs. TensorFlow)? Or is it just an interface to the Android/iOS build systems that runs regular libtorch underneath? From this answer I understand it’s regular libtorch, but I’d like to be sure.

  2. If it is a separate minimalistic library, how do I compile it with only the C++ interface (smallest possible size)?
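For context, here is the kind of build invocation I’ve been experimenting with, based on the `scripts/build_android.sh` script shipped in the PyTorch repo. I’m not certain these environment variables are the right ones for a minimal, C++-only static build, so treat this as a sketch of what I’ve tried rather than a known-good recipe:

```shell
# Sketch of a mobile build attempt from a PyTorch source checkout.
# ANDROID_NDK must point to a local NDK install; the variable names
# below are my assumptions about what controls the mobile build.
cd pytorch

# Cross-compile for Android using the repo's helper script.
ANDROID_NDK=/path/to/android-ndk \
ANDROID_ABI=arm64-v8a \
BUILD_PYTORCH_MOBILE=1 \
./scripts/build_android.sh
```

What I’d like to know is whether something like this produces the minimal mobile runtime, and which additional flags (if any) strip the Python bindings and unused operators so only the C++ inference interface remains.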
