Any way to use the C++ API on mobile for inference tasks?

As the title says, is there any way to compile the C++ API and use it on Android/iOS? Or do prebuilt binaries exist?

Hi,
We do have a binary that you can build in order to test/run your model on an android device.
Here is a link to the file (in C++) - https://github.com/pytorch/pytorch/blob/master/binaries/speed_benchmark_torch.cc

You can build the binary by running

./scripts/build_android.sh \
-DBUILD_BINARY=ON \
-DBUILD_CAFFE2_MOBILE=OFF \
-DCMAKE_PREFIX_PATH=$(python -c 'from distutils.sysconfig import get_python_lib; print(get_python_lib())') \
-DPYTHON_EXECUTABLE=$(python -c 'import sys; print(sys.executable)')

To execute it on the device, you can run

./speed_benchmark_torch --model test.pt --input_dims="1,3,224,224" --input_type=float --warmup=5 --iter 20
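If you want to call the C++ API from your own code rather than through the benchmark binary, the flow is the same as with desktop libtorch: load the TorchScript module and call forward. Here is a minimal sketch, assuming a traced/scripted model saved as test.pt that takes a single float tensor of shape 1x3x224x224 (matching the flags above); it is an illustration, not the exact code of speed_benchmark_torch.cc.

#include <torch/script.h>  // single header for the TorchScript C++ API

#include <iostream>
#include <vector>

int main() {
  // Load the serialized TorchScript model (exported via torch.jit.trace/script).
  torch::jit::script::Module module = torch::jit::load("test.pt");
  module.eval();

  // Build an input matching --input_dims="1,3,224,224" and --input_type=float.
  std::vector<torch::jit::IValue> inputs;
  inputs.push_back(torch::ones({1, 3, 224, 224}));

  // Disable autograd bookkeeping; we only need inference.
  torch::NoGradGuard no_grad;
  at::Tensor output = module.forward(inputs).toTensor();
  std::cout << output.sizes() << std::endl;
  return 0;
}

You would link this against the libtorch libraries produced by the build above; the exact library names and install paths depend on your build configuration, so check the build output.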

Any plans to release prebuilt libraries, as there are for Linux, Mac, and Windows?

@ngap_wei_Tham
PyTorch Mobile is moving toward providing tooling for custom mobile builds that include only the operators a particular model needs, since a default build with all operators may be too large in terms of library size.


We have prebuilt libraries for Android and iOS. The iOS tutorial uses the C++ API: https://pytorch.org/mobile/ios/. We will probably support the C++ API on Android eventually, but for now the easiest path is to use the Java API.


I have an example app here showing how to use a PyTorch model in C++: