Recently, while converting my model to ONNX format, I got the following stack trace:
[W lower_tuples.cpp:251] Warning: tuple appears in the op outputs, but this op does not forward tuples, unsupported kind: prim::unchecked_cast (function flattenOutputs)
Traceback (most recent call last):
  File "tools/export_as_trt.py", line 153, in <module>
    main()
  File "tools/export_as_trt.py", line 138, in main
    torch.onnx.export(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/__init__.py", line 350, in export
    return utils.export(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 163, in export
    _export(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 1074, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 731, in _model_to_graph
    graph = _optimize_graph(
  File "/opt/conda/lib/python3.8/site-packages/torch/onnx/utils.py", line 234, in _optimize_graph
    _C._jit_pass_lower_all_tuples(graph)
RuntimeError: prim::TupleUnpack not matched to tuple construct
The error mentions a mismatch involving an op with a tuple output, but it is not clear which part of the code triggers it, which makes this difficult to debug.
Any thoughts on how to go about this?