Failing to export the Swin Transformer model to the ONNX format

Hi everyone. I want to export a PyTorch Swin Transformer model to the ONNX format. However, I get the following error:

Traceback (most recent call last):
  File "/content/Rethinking_of_PAR/export_onnx.py", line 163, in <module>
    main(cfg, args)
  File "/content/Rethinking_of_PAR/export_onnx.py", line 105, in main
    torch.onnx.export(model, x, "swin_b.onnx", opset_version=12)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py", line 516, in export
    _export(
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py", line 1613, in _export
    graph, params_dict, torch_out = _model_to_graph(
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py", line 1139, in _model_to_graph
    graph = _optimize_graph(
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py", line 677, in _optimize_graph
    graph = _C._jit_pass_onnx(graph, operator_export_type)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py", line 1957, in _run_symbolic_function
    return symbolic_fn(graph_context, *inputs, **attrs)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/symbolic_opset9.py", line 7153, in onnx_placeholder
    return torch._C._jit_onnx_convert_pattern_from_subblock(block, node, env)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/utils.py", line 1957, in _run_symbolic_function
    return symbolic_fn(graph_context, *inputs, **attrs)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/symbolic_opset11.py", line 311, in index_put
    values = symbolic_helper._reshape_helper(g, values, values_shape)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/symbolic_helper.py", line 1418, in _reshape_helper
    return g.op("Reshape", input, shape)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/_internal/jit_utils.py", line 87, in op
    return _add_op(self, opname, *raw_args, outputs=outputs, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/_internal/jit_utils.py", line 246, in _add_op
    node = _create_node(
  File "/usr/local/lib/python3.10/dist-packages/torch/onnx/_internal/jit_utils.py", line 307, in _create_node
    _C._jit_pass_onnx_node_shape_type_inference(node, params_dict, opset_version)
RuntimeError: minus_one_pos != -1 INTERNAL ASSERT FAILED at "../torch/csrc/jit/passes/onnx/shape_type_inference.cpp":534, please report a bug to PyTorch. There are no examples for shape_has_zero = true && minus_one_pos == -1.

The default input size of the model is (batch_size, 3, 256, 192). When I use other input sizes such as (batch_size, 3, 256, 256) or (batch_size, 3, 256, 255), the export works without any errors. However, with the default input size (batch_size, 3, 256, 192) I get the error above.
Could you please help me fix this problem?