Model training on mobile devices

Hi, I’m interested in implementing federated learning in an Android app, and I’m hoping I can use PyTorch for this.
The main problem I have is that it does not seem to be possible to build a PyTorch library for Android that can do training and not just inference.
Another problem is that the app is very constrained in terms of the binary size it can allocate for this, so I would like to know if it’s possible to build a bare-minimum version of libtorch for Android that is <10MB in size.

Is this something that’s possible or should I try something else?

Thank you.

To build a bare-minimum version of libtorch for Android, the lite interpreter is probably worth a try. Here is the link to the recipe: (Prototype) Introduce lite interpreter workflow in Android and iOS — PyTorch Tutorials 1.8.0 documentation. Turning on BUILD_LITE_INTERPRETER can help reduce the size. Lite interpreter + custom build will reduce the size even more, since it builds libtorch with only the operators used by your models.
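As a rough sketch of the model-side preparation for such a custom build: you list the operators your model actually uses (that list is what the selective build consumes) and save the model in the lite-interpreter format. The model class and file name below are made up for illustration; the API calls (`torch.jit.export_opnames`, `_save_for_lite_interpreter`) are the prototype-stage ones, so treat this as an assumption-laden example rather than a guaranteed recipe.

```python
import torch

# Illustrative model; any scripted module works the same way.
class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = torch.nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))

scripted = torch.jit.script(TinyModel())

# List the operators the model actually uses; a custom libtorch build can
# then be restricted to just these ops to shrink the binary.
ops = torch.jit.export_opnames(scripted)
print(ops)

# Save in the lite-interpreter format that the mobile runtime loads.
scripted._save_for_lite_interpreter("tiny.ptl")
```

The printed op list is what you would feed into the selective-build step of the Android build scripts.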

As a note, the lite interpreter is in the prototype stage and still under development.

In terms of training, @iseeyuan do you know the answer?

As @cccclai mentioned, we are building a lightweight runtime on both Android and iOS. It’s not super difficult to add training capability. Actually, we have tests running in test/cpp/jit/test_lite_trainer.cpp, for functionality only. We’ll let you know when we put the necessary components into the Android build.
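For intuition on what such a trainer exercises, here is a hedged Python sketch of a training loop over a scripted module — the same idea the C++ lite trainer tests cover, though the model, loss, and optimizer settings here are illustrative and not taken from test_lite_trainer.cpp:

```python
import torch

torch.manual_seed(0)

# A scripted module still supports autograd and exposes parameters(),
# so an ordinary optimizer can train it.
model = torch.jit.script(torch.nn.Linear(3, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

# Toy regression data standing in for on-device samples.
x = torch.randn(8, 3)
y = torch.randn(8, 1)

before = loss_fn(model(x), y).item()
for _ in range(20):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()
after = loss_fn(model(x), y).item()
```

After the loop, `after` should be lower than `before`, confirming the scripted module's weights were actually updated.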