Unable to export GRU model to ONNX accepting variable-length sequences

Hello, I am trying to export to ONNX a module consisting of a simple GRU with some linear layers on top of it. The module is trained using packed sequences, so there is no fixed sequence length at training time.
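For context, here is a minimal sketch of the kind of module described. The class name, hidden size, and output head are assumptions for illustration, not the original code:

```python
import torch
import torch.nn as nn

class GruHead(nn.Module):
    # Hypothetical reconstruction: a single-layer GRU followed by a
    # linear layer producing 3 log-probabilities per time step.
    def __init__(self, hidden_size=16):
        super().__init__()
        # batch_first=False (default): input layout is (seq_len, batch, features)
        self.gru = nn.GRU(input_size=1, hidden_size=hidden_size)
        self.linear = nn.Linear(hidden_size, 3)

    def forward(self, x):
        out, _ = self.gru(x)                          # (seq_len, batch, hidden)
        return torch.log_softmax(self.linear(out), dim=-1)

my_model = GruHead()
# In eager PyTorch any sequence length works:
print(my_model(torch.randn(100, 1, 1)).shape)  # torch.Size([100, 1, 3])
print(my_model(torch.randn(57, 1, 1)).shape)   # torch.Size([57, 1, 3])
```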

I can export the model my_model to ONNX like this:

input_names = [ "time series" ] + [ f"learned_{i}" for i, p in enumerate(my_model.parameters()) ]
output_names = [ "log probability" ]

dummy_input = torch.randn(100, device='cuda').view(-1, 1, 1)
dummy_output = torch.randn(100, 3, device='cuda').view(-1, 1, 3)

my_model_jit = torch.jit.script(my_model)
torch.onnx.export(my_model_jit, dummy_input, "my_model.onnx", verbose=True,
                  input_names=input_names, output_names=output_names,
                  example_outputs=dummy_output)

The issue is that when I load the exported model, I can only run inference on series of fixed length 100 (the size I happened to pick for the dummy input):

import onnxruntime as rt
sess = rt.InferenceSession(r"my_model.onnx")

input_name = sess.get_inputs()[0].name

# this works fine with length = 100
ts = np.random.randn(100).cumsum().astype(np.float32).reshape(-1, 1, 1)
pred_onx = sess.run(None, {input_name: ts})

# this fails with length != 100
ts = np.random.randn(110).cumsum().astype(np.float32).reshape(-1, 1, 1)
pred_onx = sess.run(None, {input_name: ts})

Output:
---------------------------------------------------------------------------
InvalidArgument                           Traceback (most recent call last)
<ipython-input-162-da0f71c171b2> in <module>()
      1 ts = np.random.randn(110).cumsum().astype(np.float32).reshape( -1, 1, 1)
----> 2 pred_onx = sess.run(None, {input_name: ts})

c:\homeware\lib\site-packages\onnxruntime\capi\session.py in run(self, output_names, input_feed, run_options)
    134             output_names = [output.name for output in self._outputs_meta]
    135         try:
--> 136             return self._sess.run(output_names, input_feed, run_options)
    137         except C.EPFail as err:
    138             if self._enable_fallback:

InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Got invalid dimensions for input: time series for the following indices
 index: 0 Got: 110 Expected: 100
 Please fix either the inputs or the model.

Is there a way to build a model without this limitation, i.e. one that accepts arbitrary sequence lengths like the PyTorch model does?

Thanks a lot

I have found how to do it, and I was basically over-complicating something simple. You just need to specify dynamic_axes in your torch.onnx.export call. In particular, there is no need to jit the model; trace-based export works fine.
In my example I should have done this (note that with this (1, N, 1) input layout the sequence dimension is axis 1, which corresponds to batch_first=True in the GRU init):

dummy_input = torch.randn(100).reshape(1, -1, 1)
torch.onnx.export(my_model_LEFT_AS_NN_MODULE,
                    (dummy_input, ),
                    'test_rnn.onnx',
                    verbose=True,
                    input_names=['input'],
                    output_names=['output',],
                    dynamic_axes={'input': {1: 'sequence'}, 'output': {1: 'sequence'} } )

And then to use it, simply do:

import onnxruntime
import numpy as np
sess = onnxruntime.InferenceSession("test_rnn.onnx")
test_input = np.random.randn(100).astype(np.float32).reshape(1, -1, 1)
sess.run(["output"], {'input': test_input})
# works just as well with a different sequence length
test_input_different_size = np.random.randn(10).astype(np.float32).reshape(1, -1, 1)
sess.run(["output"], {'input': test_input_different_size})

I should have read the docs ;)
Hope this helps if anybody stumbles upon the same issue.