Hi,
Recently, I tried to convert my PyTorch model (a customized transformer) to ONNX, but the process hung during the conversion.
I found that the hang happens when running this line:
graph = _C._jit_pass_onnx(graph, operator_export_type) # in pytorch/torch/onnx/utils.py
There is no error message; the program just hangs there.
My environment is Windows 11 with 32 GB RAM, and my PyTorch version is 1.12.0. Inference with the PyTorch model works without any problem.
I used the following call for the model conversion:
torch.onnx.export(model,
                  tuple(input_data_list),
                  "onnx.onnx",
                  input_names=input_names,
                  output_names=["output"],
                  opset_version=16)
Please let me know if you have any idea what the root cause of the hang might be. Thanks.