PyTorch model deployment in MediaPipe

Hello Experts,

I am trying to deploy a PyTorch model in MediaPipe. Has anyone done such a deployment and used it? As of now, I have MediaPipe installed on my Ubuntu…


If you can convert your model to a format supported by MediaPipe, that’s the easiest solution. I’ve toyed around with writing a MediaPipe calculator that supports PyTorch here: It’s just a very basic setup, but it can easily be extended and refactored into separate modules for tensor conversion, inference, etc.

The biggest challenge I ran into is the build environment: PyTorch has a Bazel setup for Linux, so that’s not too hard to get working. However, as far as I can tell, you need to use CMake for the Android/iOS builds, which is more challenging. What finally stopped me is the lack of mobile GPU support in PyTorch at the moment.