Mobile optimized model raised error "required rank 4 tensor to use channels_last format"

I am trying to export a custom model (VGG + LSTM) from PyTorch to an Android phone. I am using PyTorch 1.9 on both the laptop and the Android device.

The exported model is optimized with this:

from torch.utils.mobile_optimizer import optimize_for_mobile

traced_model = torch.jit.trace(rnn_model, example_inputs=x_dummy)
traced_model = optimize_for_mobile(traced_model)
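In case it helps others reproduce, here is a minimal end-to-end sketch of that export path; the model, input shape, and file name are placeholders I made up, not the actual ones from this thread:

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

# Placeholder model standing in for the VGG + LSTM network from the thread
rnn_model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3),   # (N, 3, 32, 32) -> (N, 8, 30, 30)
    torch.nn.ReLU(),
    torch.nn.Flatten(),
    torch.nn.Linear(8 * 30 * 30, 10),
).eval()

x_dummy = torch.randn(1, 3, 32, 32)  # assumed NCHW example input

traced_model = torch.jit.trace(rnn_model, example_inputs=x_dummy)
traced_model = optimize_for_mobile(traced_model)

# Sanity-check the optimized module in Python before shipping it
assert traced_model(x_dummy).shape == (1, 10)

# Save in the lite-interpreter format used by the Android/iOS runtimes
traced_model._save_for_lite_interpreter("model.ptl")
```

Running the optimized module in Python first (as suggested below) is a cheap way to tell whether the problem comes from the optimization pass itself or from how the model is loaded and fed on the device.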

But when I run model.forward(…) on the Android device, I get the following error:

Pending exception com.facebook.jni.CppException: required rank 4 tensor to use channels_last format

Everything works fine without optimization. Any idea?

Have you tested the optimized model in Python? @kimishpatel any idea?

How did you generate the input? Can you provide repro steps? Also, was the model saved for mobile, or did you just use…

Hi! I am having the same issue using a model on iOS. I found this page explaining the difference between the memory formats, and indicating a conversion method:

x = x.contiguous(memory_format=torch.channels_last)

However, applying this to the model output does not help, as I am getting the same error:

required rank 4 tensor to use channels_last format

Has anybody encountered the same issue? I have no idea what I am doing wrong.

As the error message states, a 4D tensor is required, while it seems your x tensor has fewer or more dimensions.
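To illustrate (an independent sketch, not the original poster's code): channels_last is only defined for rank-4 NCHW tensors, so the conversion fails for any other rank:

```python
import torch

x3 = torch.randn(8, 32, 32)      # rank 3: missing the batch dimension
x4 = torch.randn(1, 8, 32, 32)   # rank 4: NCHW

# Converting a non-rank-4 tensor reproduces the error from this thread
try:
    x3.contiguous(memory_format=torch.channels_last)
except RuntimeError as e:
    print(e)  # required rank 4 tensor to use channels_last format

# With a rank-4 tensor the conversion succeeds
y = x4.contiguous(memory_format=torch.channels_last)
print(y.is_contiguous(memory_format=torch.channels_last))  # True
```

So before converting, check `x.dim() == 4` and, if the batch dimension was dropped somewhere, restore it with `x.unsqueeze(0)`.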


Thank you! I totally misunderstood the error message.