Network inference on GPU under Android

Does PyTorch support this type of inference?

GPU inference on mobile devices is not currently supported, but we are actively looking into it.
Stay tuned for future releases!

Any updates on this?

I recently read about ONNX Runtime, which might be an alternative.

PyTorch can export models to the ONNX format, and the exported model can then be run with ONNX Runtime:
https://pytorch.org/docs/stable/onnx.html
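For reference, here is a minimal export sketch; the model, file name, and input shape below are just illustrative placeholders:

```python
import torch
import torchvision

# Placeholder model: any traceable torch.nn.Module works here.
model = torchvision.models.mobilenet_v2(pretrained=True)
model.eval()

# Dummy input with the shape the model expects; used for tracing.
dummy_input = torch.randn(1, 3, 224, 224)

torch.onnx.export(
    model,
    dummy_input,
    "mobilenet_v2.onnx",  # output file name (illustrative)
    input_names=["input"],
    output_names=["output"],
    opset_version=11,
)
```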

According to the onnxruntime docs, NNAPI is supported, including both CPU and GPU inference.

I haven’t tried it, but it looks like the only GPU-enabled accelerator for Android at the moment.
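I haven’t run it on a device, but a quick desktop sanity check of the exported file could look like this (file and tensor names match the sketch above; on Android, the NNAPI execution provider would instead be enabled through the mobile package’s session options):

```python
import numpy as np
import onnxruntime as ort

# Desktop sanity check using the default CPU provider.
session = ort.InferenceSession("mobilenet_v2.onnx")

outputs = session.run(
    None,  # None = return all model outputs
    {"input": np.random.randn(1, 3, 224, 224).astype(np.float32)},
)
print(outputs[0].shape)
```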
