Does torch.onnx support if control flow?

Hi all,

I want to export my RNN Transducer model using torch.onnx.export. However, there is an "if" in my network's forward. I checked the documentation, and one suggested way to handle control flow is to mix tracing and scripting.

I tried the example and it works well for a "for" loop, but I get the following errors when I use "if". So I want to know whether "if" is supported.

My PyTorch version is 1.4.0.

graph(%input : Long(2, 3),
      %loop : Long(),
      %d : Long()):
  %3 : Long() = onnx::Constant[value={3}]()
  %4 : Long() = onnx::Greater(%d, %3)
  %5 : Long(2, 3) = onnx::If(%4)
    block0():
      %6 : LongTensor = onnx::Add(%input, %d)
      -> (%6)
    block1():
      -> (%input)
  return (%5)

Traceback (most recent call last):
  File "", line 31, in
    ort_sess = ort.InferenceSession('loop.onnx')
  File "/ceph/sz_ts80_new/siningsun/pytorch/espnet/tools/venv/lib/python3.7/site-packages/onnxruntime/capi/", line 158, in __init__
  File "/ceph/sz_ts80_new/siningsun/pytorch/espnet/tools/venv/lib/python3.7/site-packages/onnxruntime/capi/", line 177, in _load_model
onnxruntime.capi.onnxruntime_pybind11_state.InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Failed to load model with error: /onnxruntime_src/onnxruntime/core/graph/ void onnxruntime::Graph::InitializeStateFromModelFileGraphProto() This is an invalid model. Graph output (input) does not exist in the graph.


I would also like an explanation of this problem.

Yes, control flow is supported, but you have to export your model via scripting instead of tracing.
