To be precise: in one case I try to run the NNAPI model deployed in non-lite form in an interpreter built with BUILD_LITE_INTERPRETER=0 (successful), and in the other case the NNAPI model deployed in lite form in an interpreter built with BUILD_LITE_INTERPRETER=1 (failure).
I cannot run the lite-deployed NNAPI model in an interpreter built with BUILD_LITE_INTERPRETER=0, because that build doesn’t provide torch::jit::_load_for_mobile (the app fails to build at link time).
Note that my colleague Aurelien reported the same _nnapi.Compilation error for the distributed version of PyTorch Lite: Linking errors with Pytorch Lite 1.9.0 in Android native app - #3 by Aurelien. I am not sure that NNAPI being a prototype feature is the reason it didn’t make it into the lite version of the library, since it is present in the non-lite version.