TensorImageUtils.bitmapToFloat32Tensor execution time on Android

The development environment is Android.

For object detection, I convert my bitmap to an input tensor with TensorImageUtils.bitmapToFloat32Tensor(resizedBitmap, PrePostProcessor.NO_MEAN_RGB, PrePostProcessor.NO_STD_RGB, MemoryFormat.CHANNELS_LAST).
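For reference, this is roughly how the conversion step looks in my code (a minimal sketch: the 640x640 resize target and the timing/log code are just illustrative, and PrePostProcessor.NO_MEAN_RGB / NO_STD_RGB are the all-zero / all-one normalization arrays from the demo code my project is based on):

```java
import android.graphics.Bitmap;
import android.os.SystemClock;
import android.util.Log;

import org.pytorch.MemoryFormat;
import org.pytorch.Tensor;
import org.pytorch.torchvision.TensorImageUtils;

public class Detector {

    // Converts a camera bitmap into the float32 input tensor for the model.
    private Tensor toInputTensor(Bitmap srcBitmap) {
        // Resize target is illustrative; my model's actual input size may differ.
        Bitmap resizedBitmap = Bitmap.createScaledBitmap(srcBitmap, 640, 640, true);

        long start = SystemClock.elapsedRealtime();
        Tensor inputTensor = TensorImageUtils.bitmapToFloat32Tensor(
                resizedBitmap,
                PrePostProcessor.NO_MEAN_RGB,   // {0f, 0f, 0f} in my project
                PrePostProcessor.NO_STD_RGB,    // {1f, 1f, 1f} in my project
                MemoryFormat.CHANNELS_LAST);
        long elapsedMs = SystemClock.elapsedRealtime() - start;

        // This is the call that takes roughly 2000 ms on my device.
        Log.d("Detector", "bitmapToFloat32Tensor took " + elapsedMs + " ms");

        return inputTensor;
    }
}
```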

However, the TensorImageUtils.bitmapToFloat32Tensor call alone takes about 2 seconds, so the detection results lag noticeably behind the live camera image.

Is there a faster way to perform the conversion that TensorImageUtils.bitmapToFloat32Tensor does?