Does torch.onnx support if control flow?

Hi all,

I want to export my RNN Transducer model using torch.onnx.export. However, there is an "if" in my network's forward method. I have checked the documentation, and one suggested way to handle the control-flow problem is to mix tracing and scripting.

https://pytorch.org/docs/stable/onnx.html#tracing-vs-scripting

I have tried the example and it works well for a "for" loop, but I get the following errors when I use an "if". So I want to know whether "if" is supported.
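For reference, a minimal stand-in (names are illustrative, not the actual RNN-T model) that matches the exported graph below: a single data-dependent "if" in forward, compiled with scripting. The scripted module itself runs fine on both branches; the failure only shows up when ONNX Runtime loads the exported file.

```python
import torch

class IfModel(torch.nn.Module):
    # Hypothetical minimal model: one data-dependent "if" in forward,
    # mirroring the onnx::Greater + onnx::If pattern in the graph below.
    def forward(self, x: torch.Tensor, d: torch.Tensor) -> torch.Tensor:
        if bool(d > 3):
            return x + d
        return x  # else branch forwards the graph input unchanged

scripted = torch.jit.script(IfModel())  # scripting captures both branches
x = torch.zeros(2, 3, dtype=torch.long)
print(scripted(x, torch.tensor(4)))  # takes the "then" branch (x + d)
print(scripted(x, torch.tensor(2)))  # takes the "else" branch (x)
```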

My PyTorch version is 1.4.0.

graph(%input : Long(2, 3),
      %loop : Long(),
      %d : Long()):
  %3 : Long() = onnx::Constant[value={3}]()
  %4 : Long() = onnx::Greater(%d, %3) # test2.py:8:7
  %5 : Long(2, 3) = onnx::If(%4) # test2.py:8:4
    block0():
      %6 : LongTensor = onnx::Add(%input, %d) # test2.py:10:12
      -> (%6)
    block1():
      -> (%input)
  return (%5)

Traceback (most recent call last):
  File "test2.py", line 31, in <module>
    ort_sess = ort.InferenceSession('loop.onnx')
  File "/ceph/sz_ts80_new/siningsun/pytorch/espnet/tools/venv/lib/python3.7/site-packages/onnxruntime/capi/session.py", line 158, in __init__
    self._load_model(providers)
  File "/ceph/sz_ts80_new/siningsun/pytorch/espnet/tools/venv/lib/python3.7/site-packages/onnxruntime/capi/session.py", line 177, in _load_model
    self._sess.load_model(providers)
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Failed to load model with error: /onnxruntime_src/onnxruntime/core/graph/graph.cc:912 void onnxruntime::Graph::InitializeStateFromModelFileGraphProto() This is an invalid model. Graph output (input) does not exist in the graph.


I would also like an explanation of this problem.

Yes, control flow is supported, but you have to export your model via scripting instead of tracing.
