Mobile optimization error

I am trying to run the retinanet model included in torchvision on an Android device with PyTorch Mobile. When using the following snippet:

import torch
import torchvision
from torch.utils.mobile_optimizer import optimize_for_mobile

model = torchvision.models.detection.retinanet_resnet50_fpn(pretrained=True)
model.eval()

traced_script_module = torch.jit.script(model)
traced_script_module_optimized = optimize_for_mobile(traced_script_module)

I get the following error:

  File "xxx.py", line 28, in <module>
    traced_script_module_optimized = optimize_for_mobile(traced_script_module)
  File "yyy/torch/utils/mobile_optimizer.py", line 43, in optimize_for_mobile
    optimized_cpp_module = torch._C._jit_pass_optimize_for_mobile(script_module._c, optimization_blocklist, preserved_methods)
RuntimeError: node->kind() == prim::GetAttr INTERNAL ASSERT FAILED at "/opt/conda/conda-bld/pytorch_1603729096996/work/torch/csrc/jit/passes/freeze_module.cpp":22, please report a bug to PyTorch. Expected prim::GetAttr nodes

Is retinanet incompatible with optimize_for_mobile(), or is this a bug? Is there any workaround?
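For what it's worth, the same script → optimize flow appears to work on a simple module, which suggests the failure is specific to the retinanet graph rather than to the snippet itself. A minimal sketch (TinyModel is a made-up stand-in, not retinanet):

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# Tiny stand-in module to exercise the script -> optimize flow.
class TinyModel(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = torch.nn.Conv2d(3, 8, 3, padding=1)
        self.relu = torch.nn.ReLU()

    def forward(self, x):
        return self.relu(self.conv(x))

model = TinyModel().eval()
scripted = torch.jit.script(model)
optimized = optimize_for_mobile(scripted)

# Sanity check: the optimized module should match the original.
x = torch.randn(1, 3, 32, 32)
print(torch.allclose(model(x), optimized(x), atol=1e-4))
# On a real export you would then save the optimized module for the
# mobile runtime (e.g. optimized.save(...) or, on newer versions,
# optimized._save_for_lite_interpreter(...)).
```

This is only a control experiment; it doesn't answer whether retinanet's control flow is supposed to survive the freezing pass that the assert fires in.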

Could you create an issue on GitHub so that we can track it, please?

Of course, it has been done.