Mobile-optimized model raises error "required rank 4 tensor to use channels_last format"

I am trying to export a custom model (VGG + LSTM) from PyTorch to an Android phone. I am using PyTorch 1.9 on both the laptop and the Android device.

The model is traced and then optimized like this:

import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

traced_model = torch.jit.trace(rnn_model, example_inputs=x_dummy)
traced_model = optimize_for_mobile(traced_model)

But when I run model.forward(…) on the Android device, I get the following error:

Pending exception com.facebook.jni.CppException: required rank 4 tensor to use channels_last format

Everything works fine without the optimization step. Any ideas?

Have you tested the optimized model in Python? @kimishpatel, any ideas?
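Something like this would tell us whether the optimized graph also fails outside of Android (a minimal sketch, reusing your rnn_model and x_dummy):

import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

traced_model = torch.jit.trace(rnn_model, example_inputs=x_dummy)
optimized_model = optimize_for_mobile(traced_model)

# If the channels_last error comes from the optimized graph itself,
# this forward call should raise the same exception on the laptop.
with torch.no_grad():
    out = optimized_model(x_dummy)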

How did you generate the input? Can you provide repro steps? Also, was the model saved for mobile, or did you just use torch.jit.save?
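For reference, the distinction I mean is roughly this (file names are just placeholders):

# Saving a regular TorchScript module vs. saving for the mobile
# lite interpreter; the lite runtime on Android loads the latter (.ptl).
torch.jit.save(optimized_model, "model.pt")                 # plain TorchScript
optimized_model._save_for_lite_interpreter("model.ptl")     # saved for mobile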