Any way to use the C++ API on mobile for inference tasks?

As the title says, is there any way to compile the C++ API and use it on Android/iOS? Or do prebuilt binaries exist?

We do have a binary that you can build to test/run your model on an Android device.
Here is a link to the file (in C++) -

You can build the binary by running

./scripts/ \
-DCMAKE_PREFIX_PATH=$(python -c 'from distutils.sysconfig import get_python_lib; print(get_python_lib())') \
-DPYTHON_EXECUTABLE=$(python -c 'import sys; print(sys.executable)')

To execute you can run

./speed_benchmark_torch --model --input_dims="1,3,224,224" --input_type=float --warmup=5 --iter 20

Any plans to release prebuilt libraries, like the ones for Linux, macOS, and Windows?

PyTorch Mobile is moving towards providing tooling for custom mobile builds that include only the operators a particular model needs, since the default all-ops build may be oversized for mobile use.


We have prebuilt libraries for Android and iOS. The iOS tutorial uses the C++ API: . We will probably support the C++ API on Android eventually, but for now the easiest path is to use the Java API.


I have an example app here showing how to use a PyTorch model in C++: