I am trying to convert a PyTorch-based model into ONNX format. The PyTorch model works fine, but the conversion fails. I am exporting self.model like this:
random_input = torch.randn([8, 6, 224, 224], device=self.device, requires_grad=True)
onnx_model_name = "mi_volo.onnx"

# pytorch to onnx
torch.onnx.export(
    self.model,
    random_input,
    onnx_model_name,
    verbose=True,
    opset_version=18,
    export_params=True,
    input_names=["input"],
    output_names=["output"],
)
but I am getting the following error:
============= Diagnostic Run torch.onnx.export version 2.0.1+cu117 =============
verbose: False, log level: Level.ERROR
======================= 0 NONE 0 NOTE 0 WARNING 0 ERROR ========================
Traceback (most recent call last):
File "/home/master/.local/lib/python3.10/site-packages/torch/onnx/symbolic_opset18.py", line 52, in col2im
num_dimensional_axis = symbolic_helper._get_tensor_sizes(output_size)[0]
TypeError: 'NoneType' object is not subscriptable
I tried debugging the error, but couldn't get far since I'm not very familiar with the conversion process. The line that raises the error inside torch/onnx/symbolic_opset18.py is:
num_dimensional_axis = symbolic_helper._get_tensor_sizes(output_size)[0]
At the time of the error, I inspected the output_size variable; its value is:
1072 defined in (%1072 : int[] = prim::ListConstruct(%958, %963), scope: mivolo.model.mivolo_model.MiVOLOModel::/torch.nn.modules.container.Sequential::network.0/timm.models.volo.Outlooker::network.0.0/timm.models.volo.OutlookAttention::attn)
Digging deeper, inside the _get_tensor_sizes helper, the following branch is what returns None:

if not _is_tensor(x) or x.type() is None:
    return None
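If it helps with diagnosing: I believe output_size here is built at trace time from aten::size values (the prim::ListConstruct shown above), so the exporter has no static type information for it. A toy module with the same F.unfold/F.fold round trip (my own sketch, not MiVOLO code) shows where such a dynamic output_size comes from:

```python
import torch
import torch.nn.functional as F


class FoldRoundTrip(torch.nn.Module):
    """Toy module mimicking the unfold/fold pattern used in OutlookAttention."""

    def forward(self, x):
        # Sizes read from the input's shape; during ONNX export tracing these
        # become traced values, so output_size is a dynamic int list.
        h, w = x.shape[-2], x.shape[-1]
        cols = F.unfold(x, kernel_size=3, padding=1)  # -> (N, C*9, H*W)
        # Overlapping patches are summed back; output shape matches the input.
        return F.fold(cols, output_size=(h, w), kernel_size=3, padding=1)


x = torch.randn(1, 3, 8, 8)
out = FoldRoundTrip()(x)
print(out.shape)  # torch.Size([1, 3, 8, 8])
```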
MiVOLO - latest pull
PyTorch version - 2.0.1
ONNX version - 1.14.1
OS - Ubuntu 22.04.3 LTS
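One workaround I've seen suggested for tracing issues like this (I haven't confirmed it fixes this particular export) is to force the spatial sizes to plain Python ints before calling F.fold, so the tracer bakes them into the graph as constants instead of a dynamic int list. Sketched on a toy module (FoldStatic is my own name, not from MiVOLO):

```python
import torch
import torch.nn.functional as F


class FoldStatic(torch.nn.Module):
    def forward(self, x):
        # int(...) forces concrete Python ints at trace time, so the exported
        # graph should receive output_size as constants. Note this also fixes
        # the spatial size, so the exported model would no longer be
        # shape-dynamic in H and W.
        h, w = int(x.shape[-2]), int(x.shape[-1])
        cols = F.unfold(x, kernel_size=3, padding=1)
        return F.fold(cols, output_size=(h, w), kernel_size=3, padding=1)
```

If that pattern works on the toy module, the equivalent change would have to go into the OutlookAttention forward before re-running torch.onnx.export.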
Any help/direction/discussion will be highly appreciated, thank you.