A way to get your networks onto mobile and the web

After digging around a bit, I found this process works best for compressing my models and getting them into a browser:

  1. Train using the float32 type
  2. Export the model to ONNX (or to .pth, then convert to ONNX later)
  3. Convert from ONNX to ORT format using the ONNX Runtime converter (roughly halves the size)

I wonder if others are doing the same, or following different strategies.