Failed to finish "_jit_pass_onnx" when converting PyTorch model to ONNX

Hi,
Recently, I tried to convert my PyTorch model (a custom transformer) to ONNX, but the process hung during the conversion.
I found that the hang happened when running this line:

graph = _C._jit_pass_onnx(graph, operator_export_type)  # in pytorch/torch/onnx/utils.py

There is no error message; the program just hangs there.

My environment is Windows 11 with 32 GB RAM, and my PyTorch version is 1.12.0. Inference with the PyTorch model works without any problem.
I used the following command for model conversion:

        torch.onnx.export(model,
                          tuple(input_data_list),
                          "onnx.onnx",
                          input_names=input_names,
                          output_names=['output'],
                          opset_version=16)

Please tell me if you know the root cause of the hanging issue. Thanks.

I just solved the problem myself.
My PyTorch model contained recursive blocks, and I think the root cause might be the "cycle" they create in the traced graph. The workaround is to convert the encoder and decoder parts separately, as sketched below.
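For anyone hitting the same hang, here is a minimal sketch of the workaround. It assumes the model exposes `model.encoder` and `model.decoder` submodules and that `encoder_inputs`, `decoder_inputs`, and the `*_input_names` lists are placeholders for your own example tensors and names, so adapt them to your model:

        import torch

        # Hypothetical submodules; replace with however your model is structured.
        encoder = model.encoder.eval()
        decoder = model.decoder.eval()

        # Export the encoder on its own so its traced graph contains no recursion.
        torch.onnx.export(encoder,
                          tuple(encoder_inputs),
                          "encoder.onnx",
                          input_names=encoder_input_names,
                          output_names=['memory'],
                          opset_version=16)

        # Export the decoder separately, with example inputs shaped like the
        # encoder output it will consume at inference time.
        torch.onnx.export(decoder,
                          tuple(decoder_inputs),
                          "decoder.onnx",
                          input_names=decoder_input_names,
                          output_names=['output'],
                          opset_version=16)

At inference time you then run the two ONNX files in sequence, feeding the encoder's output into the decoder.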

By the way, here is the source code (C++) of _jit_pass_onnx