Bidirectional LSTM and ONNX runtime warnings

Hi,

I am trying to export a model that includes bidirectional LSTM layers. Whenever I export it to .onnx, the export succeeds, but I get a few warnings that I am not sure how to get rid of:

/opt/anaconda3/envs/onnx/lib/python3.8/site-packages/torch/onnx/symbolic_opset9.py:2119: UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model.
  warnings.warn("Exporting a model to ONNX with a batch_size other than 1, " +
WARNING: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function.
/opt/anaconda3/envs/onnx/lib/python3.8/site-packages/torch/onnx/symbolic_helper.py:716: UserWarning: allowzero=0 by default. In order to honor zero value in shape use allowzero=1
  warnings.warn("allowzero=0 by default. In order to honor zero value in shape use allowzero=1")
WARNING: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function.
WARNING: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function.

I believe the first warning is just a caution against running the model with an inappropriate batch size, since it shows up even when the model's batch size is set correctly:

UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model.

However, I am not sure what this warning means or why it appears several times:

WARNING: The shape inference of prim::Constant type is missing, so it may result in wrong shape inference for the exported graph. Please consider adding it in symbolic function.

If I look at the output graph, there is a prim::Constant tensor that apparently goes nowhere and appears only once in the whole graph output:

%19 : Tensor? = prim::Constant()

Here is minimal code to reproduce this output. I am using opset 14, but the same happens with other opset versions:

import torch
import torch.onnx
import torch.nn as nn


class MyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.bilstm = nn.LSTM(256, 128, 
                              num_layers=2, 
                              bidirectional=True)
    
    def forward(self, x: torch.Tensor, h: torch.Tensor, c: torch.Tensor):
        x, (h, c) = self.bilstm(x, (h, c))
        return x, h, c


if __name__ == "__main__":
    x = torch.randn([1, 1, 256], dtype=torch.float32)
    h = torch.randn([2 * 2, 1, 128], dtype=torch.float32)
    c = torch.randn([2 * 2, 1, 128], dtype=torch.float32)
    net = MyNet()
    net.eval()
    net(x, h, c)

    torch.onnx.export(net,
                      (x, h, c),
                      "onnx_warn_test.onnx",
                      input_names=["x_in", "h_in", "c_in"],
                      output_names=["x_out", "h_out", "c_out"],
                      opset_version=14,
                      verbose=True)

Thanks a lot for your help!

Have you solved this problem?

Hi,

I am still not sure about its origin. Since the model exports anyway, and these are warnings rather than errors, I went ahead and used it, and it seems to work correctly.

I have the same problem. I want to use this ONNX model in Unity via Barracuda, but it doesn't work.