Trouble converting to ONNX with a dynamic batch axis

I converted this model to ONNX with a dynamic batch axis as below, and the export succeeded.
The model comes from https://github.com/facebookresearch/detr.git,
and my converted model is at https://drive.google.com/file/d/12v98uC0jxfktdRc3aJnpuwhT0k2Yj7P7/view?usp=sharing

My export script is as follows:

import torch
import onnx

output_onnx = 'det.onnx'
print("==> Exporting model to ONNX format at '{}'".format(output_onnx))
input_names = ["input"]
output_names = ["scores", "boxes"]
# Mark the batch axis (dim 0) of the input and both outputs as dynamic.
dynamic_axes = {'input': {0: 'batch_size'}, 'scores': {0: 'batch_size'}, 'boxes': {0: 'batch_size'}}

inputs = torch.randn(1, 3, 800, 800).to('cpu')
# torch.onnx.export returns None; it writes the serialized model to output_onnx.
torch.onnx.export(net, inputs, output_onnx, export_params=True, verbose=False,
                  input_names=input_names, output_names=output_names,
                  opset_version=11, dynamic_axes=dynamic_axes)

# Sanity check: the first input should report 'batch_size' as its dim 0.
onnx_model = onnx.load(output_onnx)
print(onnx_model.graph.input[0])


However, when I serve it in Triton Inference Server with the onnxruntime_onnx backend, I get an error telling me the model's outputs are declared with fixed dimensions in the configuration but are dynamic in the model. I don't know what is wrong with it.
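For context, my understanding is that when `max_batch_size` is greater than zero, Triton treats the first axis as an implicit batch dimension, so the shapes in `config.pbtxt` must be listed without it. A minimal sketch of what I think the config should look like; the output shapes `[100, 92]` and `[100, 4]` are assumptions based on DETR's default 100 queries and may need adjusting:

```
# config.pbtxt (sketch; dims exclude the batch axis because max_batch_size > 0)
name: "det"
platform: "onnxruntime_onnx"
max_batch_size: 8
input [
  {
    name: "input"
    data_type: TYPE_FP32
    dims: [ 3, 800, 800 ]
  }
]
output [
  {
    name: "scores"
    data_type: TYPE_FP32
    dims: [ 100, 92 ]
  },
  {
    name: "boxes"
    data_type: TYPE_FP32
    dims: [ 100, 4 ]
  }
]
```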