Saving a model compiled with Torch-TensorRT with dynamic inputs

I can successfully compile a model with dynamic inputs using Torch-TensorRT, as specified in the docs:

import torch
import torch_tensorrt as trt

# Dynamic batch dimension: anything from 1 to 64, optimized for batch size 50
inputs = [trt.Input(min_shape=(1, 1, 28, 28),
                    opt_shape=(50, 1, 28, 28),
                    max_shape=(64, 1, 28, 28),
                    dtype=torch.float32)]

exp_program = trt.dynamo.trace(model_from_state, inputs)
trt_gm = trt.dynamo.compile(exp_program, inputs=inputs)

However, when I try to save the model using torch_tensorrt.save (as specified here), the inputs parameter only accepts plain torch.Tensor objects:

trt.save(trt_gm, "trt_model.ep", inputs=inputs)

This call throws:

ValueError: Not all inputs provided are torch.tensors. Please provide torch.tensors as inputs
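Presumably the check wants concrete sample tensors, so I assume something like the following would get past the type check (a sketch; the batch size of 50 is just an arbitrary value inside the compiled range), but it is not clear to me whether the dynamic shape profile survives the save:

sample = torch.randn(50, 1, 28, 28, dtype=torch.float32)  # concrete example input
trt.save(trt_gm, "trt_model.ep", inputs=[sample])          # passes tensors, not trt.Input specs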

What is the correct way to save a model compiled ahead-of-time with Torch-TensorRT that has dynamic inputs? In theory, providing the dynamic inputs via torch_tensorrt.Input is supposed to handle the dynamic “Dim” object creation internally and attach it to the compiled graph, but I don’t see a way to save the resulting model.
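For context, this is roughly what I mean by the “Dim” handling, written by hand with plain torch.export rather than through Torch-TensorRT (an untested sketch; the dimension name and example shape are placeholders):

import torch
from torch.export import Dim, export

batch = Dim("batch", min=1, max=64)     # dynamic batch dimension, matching min/max above
example = torch.randn(50, 1, 28, 28)    # any shape inside the range works for tracing
exp = export(model_from_state, (example,), dynamic_shapes=({0: batch},))
torch.export.save(exp, "model.ep")      # save the exported program directly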