Inference time mismatch between Java and C++ APIs

Hi PyTorch Community,

I used pytorch_lite:12.2 (the Java API, arm64-v8a) for inference on an Android phone and found that a single inference takes about 700 ms. However, when I run the same model through the C++ API via JNI, a single inference takes about 3 seconds. Can somebody help me understand this discrepancy?
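For reference, this is roughly how the timings are taken (a simplified sketch: `runWorkload` is a hypothetical stand-in for the actual `Module.forward` call, and warm-up iterations are discarded so that one-time costs such as model loading or first-run graph optimization do not skew the average):

```java
public class InferenceTimer {

    // Stand-in for the real inference call (e.g. module.forward(inputTensor));
    // here it is just a CPU-bound dummy workload so the sketch is self-contained.
    static void runWorkload() {
        double x = 0;
        for (int i = 0; i < 1_000_000; i++) {
            x += Math.sqrt(i);
        }
    }

    // Run `warmup` untimed iterations, then average the wall-clock time
    // of `runs` timed iterations, returning milliseconds per iteration.
    static double averageMillis(int warmup, int runs) {
        for (int i = 0; i < warmup; i++) {
            runWorkload();
        }
        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) {
            runWorkload();
        }
        return (System.nanoTime() - start) / 1e6 / runs;
    }

    public static void main(String[] args) {
        System.out.println("avg ms per run: " + averageMillis(3, 10));
    }
}
```

The same warm-up-then-average pattern is applied on the C++/JNI side, so the 700 ms vs. 3 s gap should not be explained by first-inference overhead alone.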