Errors while converting Def-DETR pth to .onnx

I am attempting to convert a Deformable DETR (Def-DETR) model to ONNX using the dynamo-based exporter.

from torch.export import Dim

batch_dim = Dim("batch_size", min=1, max=4)
images_dim = Dim("dimensions")
images_height = Dim("height")
images_width = Dim("width")

torch.onnx.export(
    model,                            # model being run
    model_inputs,                     # model input (or a tuple for multiple inputs)
    "model.onnx",                     # where to save the model
    export_params=True,               # store the trained parameter weights inside the model file
    opset_version=16,                 # the ONNX opset version to target
    input_names=["images", "masks"],  # the model's input names
    output_names=["out"],             # the model's output names
    dynamo=True,
    dynamic_shapes={
        "images": {0: batch_dim, 1: images_dim, 2: images_height, 3: images_width},
        "masks": {0: batch_dim, 1: images_height, 2: images_width},
    },
    # report=True,
    # verify=True,
    verbose=True,
)
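For reference, a sketch of how `model_inputs` can be constructed (the 800x800 size and the DETR-style boolean padding mask are illustrative assumptions, not the exact values I use):

```python
import torch

# Illustrative dummy inputs (assumption: DETR-style interface taking a padded
# image batch plus a boolean padding mask per image).
batch, channels, height, width = 2, 3, 800, 800
images = torch.randn(batch, channels, height, width)
masks = torch.zeros(batch, height, width, dtype=torch.bool)  # False = valid pixel
model_inputs = (images, masks)
```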

I am getting the error below.

Exception summary

<class 'torch.fx.experimental.symbolic_shapes.GuardOnDataDependentSymNode'>: Could not extract specialized integer from data-dependent expression u0 (unhinted: u0). (Size-like symbols: none)

Caused by: (_export/non_strict_utils.py:557 in torch_function)
For more information, run with TORCH_LOGS="dynamic"
For extended logs when we create symbols, also add TORCHDYNAMO_EXTENDED_DEBUG_CREATE_SYMBOL="u0"
If you suspect the guard was triggered from C++, add TORCHDYNAMO_EXTENDED_DEBUG_CPP=1
For more debugging help, see Dealing with GuardOnDataDependentSymNode errors - Google Docs

For C++ stack trace, run with TORCHDYNAMO_EXTENDED_DEBUG_CPP=1

The following call raised this error:
File "models/deformable_transformer.py", line 241, in get_reference_points
ref_y, ref_x = torch.meshgrid(torch.linspace(0.5, H_ - 0.5, H_, dtype=torch.float32, device=device),

(Refer to the full stack trace above for more information.)
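For context, the failing line sits in a loop over per-level feature-map sizes that are read out of a tensor, so `H_` and `W_` become data-dependent symbols (`u0`, ...) under export. A minimal sketch of the pattern (my reconstruction of `get_reference_points`, not the exact source), which runs fine in eager mode:

```python
import torch

# Sketch of the get_reference_points pattern (assumption: mirrors
# models/deformable_transformer.py). H_ and W_ are read out of a tensor, so
# during export they are unbacked, data-dependent symbols, and
# linspace(..., steps=H_) cannot be specialized to a concrete integer.
spatial_shapes = torch.tensor([[4, 6], [2, 3]])  # (num_levels, 2): (H, W) per level
reference_points = []
for lvl in range(spatial_shapes.shape[0]):
    H_, W_ = spatial_shapes[lvl, 0], spatial_shapes[lvl, 1]
    # In eager mode the 0-dim tensors H_ / W_ are implicitly converted to ints;
    # under torch.export that conversion is the data-dependent guard that fails.
    ref_y, ref_x = torch.meshgrid(
        torch.linspace(0.5, H_ - 0.5, H_, dtype=torch.float32),
        torch.linspace(0.5, W_ - 0.5, W_, dtype=torch.float32),
        indexing="ij",
    )
    reference_points.append(torch.stack((ref_x.reshape(-1), ref_y.reshape(-1)), -1))
out = torch.cat(reference_points, 0)  # one (x, y) point per pixel across levels
```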

I would be grateful for any pointers on how to resolve this. Thanks in advance!