PyTorch LSTM in ONNX.js - Uncaught (in promise) Error: unrecognized input '' for node: LSTM_4

I am trying to run a PyTorch LSTM network in the browser, but I am getting this error:

graph.ts:313 Uncaught (in promise) Error: unrecognized input '' for node: LSTM_4
    at t.buildGraph (graph.ts:313)
    at new t (graph.ts:139)
    at Object.from (graph.ts:77)
    at t.load (model.ts:25)
    at session.ts:85
    at t.event (instrument.ts:294)
    at e.initialize (session.ts:81)
    at e.<anonymous> (session.ts:63)
    at onnx.min.js:14
    at Object.next (onnx.min.js:14)

How can I resolve this? Here is my code for saving the model to onnx:

import torch

net = torch.load('trained_model/trained_model.pt')
net.eval()

with torch.no_grad():
    input = torch.tensor([[1, 2, 3, 4, 5, 6, 7, 8, 9]])
    h0, c0 = net.init_hidden(1)
    output, (hn, cn) = net(input, (h0, c0))

    torch.onnx.export(net, (input, (h0, c0)), 'trained_model/trained_model.onnx',
                      input_names=['input', 'h0', 'c0'],
                      output_names=['output', 'hn', 'cn'],
                      dynamic_axes={'input': {0: 'sequence'}})

I marked input as the only dynamic axis, since it is the only input whose size can vary. With this code the model saves properly as trained_model.onnx, but it does give me a warning:

UserWarning: Exporting a model to ONNX with a batch_size other than 1, with a variable length with LSTM can cause an error when running the ONNX model with a different batch size. Make sure to save the model with a batch size of 1, or define the initial states (h0/c0) as inputs of the model. 
  warnings.warn("Exporting a model to ONNX with a batch_size other than 1, "

This warning is a little confusing, since I am exporting with a batch_size of 1:

  • input has shape torch.Size([1, 9]) - corresponding to (batch_size, num_in_sequence)
  • h0 has shape torch.Size([2, 1, 256]) - corresponding to (num_lstm_layers, batch_size, hidden_dim)
  • c0 also has shape torch.Size([2, 1, 256]) - same as h0

But since I do define h0/c0 as inputs of the model, I don't think this warning relates to the problem.

This is my JavaScript code for running the model in the browser:

<script src="https://cdn.jsdelivr.net/npm/onnxjs/dist/onnx.min.js"></script>
<!-- Code that consumes ONNX.js -->
<script>
  // create a session
  const myOnnxSession = new onnx.InferenceSession();

  console.log('trying to load the model')

  // load the ONNX model file
  myOnnxSession.loadModel("./trained_model.onnx").then(() => {

    console.log('successfully loaded model!')

    // after this I generate input and run the model
    // since my code fails before this it isn't relevant
  });
</script>

Based on the console.log statements, it is failing to load the model. How should I resolve this? If relevant, I'm using Python 3.8.5, PyTorch 1.6.0, and ONNX 1.8.0.