How to convert a pretrained .pt PyTorch model generated by YOLOv5 into a .ptl (PyTorch lite interpreter format) model to load on mobile

Hello,

We have a custom model trained with YOLOv5, and the default save format is .pt. Is there an appropriate way to convert this model into a .ptl file so that we can deploy it on mobile?

We tried the tutorial (Prototype) Introduce lite interpreter workflow in Android and iOS — PyTorch Tutorials 1.9.0+cu102 documentation, but it didn't work. The code we used is listed below:

import torch
from yolov5.models.experimental import attempt_load

path_to_weight_file = '/path/to/our_own_model.pt'
device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

# set up model
model = attempt_load(path_to_weight_file, map_location=device)
model.eval()

# convert to TorchScript and save in lite interpreter format
scripted_module = torch.jit.script(model)
scripted_module._save_for_lite_interpreter("scripted.ptl")

We got the error:

Traceback (most recent call last):
 File "convertToPTL.py", line 24, in <module>
  scripted_module = torch.jit.script(model)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_script.py", line 942, in script
  return torch.jit._recursive.create_script_module(
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 391, in create_script_module
  return create_script_module_impl(nn_module, concrete_type, stubs_fn)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 448, in create_script_module_impl
  script_module = torch.jit.RecursiveScriptModule._construct(cpp_module, init_fn)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_script.py", line 391, in _construct
  init_fn(script_module)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 428, in init_fn
  scripted = create_script_module_impl(orig_value, sub_concrete_type, stubs_fn)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 448, in create_script_module_impl
  script_module = torch.jit.RecursiveScriptModule._construct(cpp_module, init_fn)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_script.py", line 391, in _construct
  init_fn(script_module)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 428, in init_fn
  scripted = create_script_module_impl(orig_value, sub_concrete_type, stubs_fn)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 452, in create_script_module_impl
  create_methods_and_properties_from_stubs(concrete_type, method_stubs, property_stubs)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_recursive.py", line 335, in create_methods_and_properties_from_stubs
  concrete_type._create_methods_and_properties(property_defs, property_rcbs, method_defs, method_rcbs, method_defaults)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_script.py", line 1106, in _recursive_compile_class
  _compile_and_register_class(obj, rcb, _qual_name)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/_script.py", line 65, in _compile_and_register_class
  ast = get_jit_class_def(obj, obj.__name__)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/frontend.py", line 173, in get_jit_class_def
  methods = [get_jit_def(method[1],
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/frontend.py", line 173, in <listcomp>
  methods = [get_jit_def(method[1],
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/frontend.py", line 271, in get_jit_def
  return build_def(ctx, fn_def, type_line, def_name, self_name=self_name)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/frontend.py", line 293, in build_def
  param_list = build_param_list(ctx, py_def.args, self_name)
 File "/Users/yin/Library/Python/3.8/lib/python/site-packages/torch/jit/frontend.py", line 320, in build_param_list
  raise NotSupportedError(ctx_range, _vararg_kwarg_err)
torch.jit.frontend.NotSupportedError: Compiled functions can't take variable number of arguments or use keyword-only arguments with defaults:
 File "/Applications/Xcode.app/Contents/Developer/Library/Frameworks/Python3.framework/Versions/3.8/lib/python3.8/warnings.py", line 477
  def __exit__(self, *exc_info):
            ~~~~~~~~~ <--- HERE
    if not self._entered:
      raise RuntimeError("Cannot exit %r without entering first" % self)
'__torch__.warnings.catch_warnings' is being compiled since it was called from 'SPPF.forward'
 File "/Users/yin/Desktop/real-time parking sign detection/yolov5/models/common.py", line 191
  def forward(self, x):
    x = self.cv1(x)
    with warnings.catch_warnings():
       ~~~~~~~~~~~~~~~~~~~~~~~ <--- HERE
      warnings.simplefilter('ignore') # suppress torch 1.9.0 max_pool2d() warning
      y1 = self.m(x)

I have the same issue when converting .pt to .ptl for running in an Android app. Have you fixed it yet?

Refer to the thread Convert pytorch model to ptl.
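
For reference, the usual lite-interpreter export path adds an optimize_for_mobile step before saving. Below is a minimal sketch (not necessarily identical to what that thread shows) that traces the model instead of scripting it, since tracing records tensor operations and does not try to compile Python-level constructs such as the warnings.catch_warnings context manager. The checkpoint path, CPU device, 640x640 input size, and output file name are placeholders/assumptions:

import torch
from torch.utils.mobile_optimizer import optimize_for_mobile
from yolov5.models.experimental import attempt_load

# load the trained YOLOv5 checkpoint (placeholder path)
model = attempt_load('/path/to/our_own_model.pt', map_location=torch.device('cpu'))
model.eval()

# trace with a dummy input (assumed 1x3x640x640); unlike torch.jit.script,
# tracing does not attempt to compile the warnings.catch_warnings block in SPPF.forward
example = torch.rand(1, 3, 640, 640)
traced = torch.jit.trace(model, example, strict=False)

# optimize for mobile and save in lite-interpreter (.ptl) format
optimized = optimize_for_mobile(traced)
optimized._save_for_lite_interpreter("our_own_model.ptl")

Tracing bakes in the control flow seen for the example input, so check that the resulting outputs match the eager model before shipping it.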

You might have to check whether your model definition (or anything called from its forward methods) takes a variable number of arguments or keyword-only arguments with defaults, since TorchScript cannot compile those; see the sketch below.
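
A quick diagnostic sketch, assuming the model object loaded with attempt_load as in the original post; find_vararg_forwards is a hypothetical helper name. Note that in the traceback above the offender is warnings.catch_warnings.__exit__(self, *exc_info), which torch.jit.script pulls in through the with warnings.catch_warnings(): block in SPPF.forward, so you also have to look at helper functions and context managers used inside forward, not just module signatures:

import inspect
import torch.nn as nn

def find_vararg_forwards(model: nn.Module):
    """List submodules whose forward() takes *args or **kwargs (TorchScript rejects these)."""
    offenders = []
    for name, module in model.named_modules():
        sig = inspect.signature(module.forward)
        if any(p.kind in (inspect.Parameter.VAR_POSITIONAL, inspect.Parameter.VAR_KEYWORD)
               for p in sig.parameters.values()):
            # containers that never override forward() inherit nn.Module's *input
            # placeholder and may show up here as harmless false positives
            offenders.append((name or type(module).__name__, str(sig)))
    return offenders

for name, sig in find_vararg_forwards(model):
    print(name, sig)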