How to save tensors on mobile with lite interpreter

Based on the discussion in #30108, it's clear that pickle_save is not supported on mobile, because /csrc/jit/serialization/export.cpp is not included when building for the lite interpreter. Calling it produces the following runtime error:

```cpp
AT_ERROR("pickle_save not supported on mobile " "(see");
```

For loading, there is the option of using `torch::jit::_load_for_mobile`.

However, are there any methods, or alternative approaches, for serialising and saving c10::IValue objects on the mobile device?

The lite interpreter doesn't support saving models. The model is generated on desktop and deployed to the mobile device. See the deployment workflow on the PyTorch homepage.
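That desktop-side step can be sketched as follows, assuming a recent PyTorch build; `torch.jit.script` and `_save_for_lite_interpreter` are the public export path, and `torch.jit.mobile._load_for_lite_interpreter` lets you sanity-check the artifact from Python before shipping it (on device you would load it with `torch::jit::_load_for_mobile` instead).

```python
import torch
from torch.jit.mobile import _load_for_lite_interpreter


class TinyModel(torch.nn.Module):
    def forward(self, x):
        return x * 2.0


# On desktop: script the model and export it for the lite interpreter.
scripted = torch.jit.script(TinyModel())
scripted._save_for_lite_interpreter("tiny.ptl")

# Sanity check: reload the lite-interpreter artifact and run it.
m = _load_for_lite_interpreter("tiny.ptl")
print(m(torch.ones(2)))
```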

While true, that answer is misleading. I modified the PyTorch source code slightly to enable save support in the lite interpreter, and I'm surprised it isn't allowed by default: once enabled, it appears to work without any issues.