Error when Exporting Model to Onnx format

Hey there PyTorch community! This is my first time posting and I'm very new to ML, but I've run into a problem I haven't been able to find a solution for, so I decided to ask here.

I've been experimenting with LSTMs to learn how they work and wanted to export a model to ONNX to test it out, but whenever I try to export, the program closes with the following message:

“UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model.”

My research hasn't turned up anything that helps with this issue, and honestly I don't even know what to try, since I can't find any relevant information online and I don't fully understand the warning message.

Here is my code for exporting:

def export_model(model):
    with torch.inference_mode():
        x = torch.randn(1, 1, 3)
        h0 = torch.zeros(4, 1, 8)
        c0 = torch.zeros(4, 1, 8)
        # Dry run to make sure the model runs without errors before exporting
        torch_out, hidden = model(x, (h0, c0))
        input_names = ["input", "h0", "c0"]
        output_names = ["output", "hn", "cn"]

        torch.onnx.export(model, (x, (h0, c0)), "lstm.onnx",
                          export_params=True,
                          input_names=input_names,
                          output_names=output_names)
    print("Exporting Model: lstm.onnx")

If anyone has any insights that would be greatly appreciated.
Thanks :slight_smile:
