How to compile libtorch for mobile?

I’m building a cross-platform (Windows, Linux, macOS, Android, iOS) C++ library which runs a couple of ML models (inference only), currently using TensorFlow Lite. With mobile support in the latest PyTorch release, I’m considering switching to it. But I’m not sure I understand things right, so here are a few questions.

  1. I would want to link libtorch statically into my lib. Is libtorch mobile a minimalistic version of libtorch (kind of like TensorFlow Lite vs. TensorFlow)? Or is it just an interface to the Android/iOS build systems, running a regular libtorch underneath? From this answer I understand it’s a regular libtorch, but just to be sure.

  2. If it is a separate minimalistic library, how do I compile it with only the C++ interface (minimal possible size)?
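For what it’s worth, the PyTorch repository ships build scripts for exactly this kind of C++-only mobile build. A rough sketch, assuming a checkout around the 1.3/1.4 releases (script names and environment variables may differ in your version, and the NDK path below is a placeholder):

```shell
# Sketch: building libtorch for mobile from a PyTorch source checkout.
# Script names/variables reflect the 1.3-era repo; verify against your checkout.
git clone --recursive https://github.com/pytorch/pytorch
cd pytorch

# Host build with mobile-oriented options (static libs, C++ only,
# no Python bindings) -- useful for checking size and linking locally:
./scripts/build_mobile.sh

# Android cross-compile; requires the Android NDK.
# /path/to/ndk is a placeholder; ANDROID_ABI selects the target architecture.
ANDROID_NDK=/path/to/ndk ANDROID_ABI=arm64-v8a ./scripts/build_android.sh

# iOS build (must be run on macOS with Xcode installed):
./scripts/build_ios.sh
```

The mobile scripts disable the Python bindings and other desktop-only components, which is what keeps the resulting static libraries smaller than a full libtorch build.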


Hi,

I had the same question as you. Did you find anything new? (Especially regarding 1.)

Hey, I didn’t find anything back in 2019, and haven’t looked into the topic since then.