My PyTorch model takes variable-length inputs. It uses max-pooling at several locations to downsample features.
I would like to export this model to ExecuTorch (*.pte), to run on mobile.
I just cannot for the life of me get the variable-length part working (the model exports easily with fixed-length inputs).
There are several tutorials and forum threads on this (sorry, new users can only put two links in a post):
- Exporting to ExecuTorch Tutorial — ExecuTorch 0.5 documentation
- torch.export Tutorial — PyTorch Tutorials 2.6.0+cu124 documentation
None of the presented syntaxes solved my issue.
I am using PyTorch version 2.6.0+cu124.
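For context, here is a stripped-down stand-in for the model (the layer sizes and pool kernel are made up; my real network is larger, but it has the same conv → max-pool structure and the same F.max_pool1d(x, self.maxpool) call that shows up in the logs below):

import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyNet(nn.Module):
    # Hypothetical stand-in: conv features downsampled by max-pooling.
    def __init__(self, maxpool: int = 4):
        super().__init__()
        self.conv = nn.Conv1d(1, 8, kernel_size=3, padding=1)
        self.maxpool = maxpool  # max-pool kernel size
    def forward(self, x):  # x: (batch, channels, length), length varies
        x = self.conv(x)
        x = F.max_pool1d(x, self.maxpool)  # downsampling step, as in my model
        return x

model = ToyNet().eval()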
Fixed Length:
dummy_dim = 20000
sample_args = (torch.randn(1, 1, dummy_dim),)
ex = torch.export.export(model, sample_args)
test_inp = torch.randn(1, 1, dummy_dim)
ex.module()(test_inp)  # → returns model output
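Lowering that fixed-length export to a .pte also works fine. A minimal sketch of what I do, following the ExecuTorch tutorial linked above (to_edge/to_executorch are the documented entry points; I have left out any backend partitioners):

from executorch.exir import to_edge

edge = to_edge(ex)              # ExportedProgram → edge dialect
et_prog = edge.to_executorch()  # edge dialect → ExecuTorch program
with open("model.pte", "wb") as f:
    f.write(et_prog.buffer)     # serialized program for the mobile runtime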
Variable Length (this is the only version I managed to get to export):
dummy_dim = 20000
sample_args = (torch.randn(1, 1, dummy_dim),)
dynamic_shapes = {"x": {2: torch.export.Dim.AUTO}}
ex = torch.export.export(model, sample_args, dynamic_shapes=dynamic_shapes)
test_inp1 = torch.randn(1, 1, dummy_dim)
test_inp2 = torch.randn(1, 1, dummy_dim // 2)
ex.module()(test_inp1)  # → returns model output, same shape as dummy
ex.module()(test_inp2)  # → RuntimeError: Expected input at *args[0].shape[2] to be equal to 20000, but got 10000
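For completeness, I also tried the explicit-Dim syntax from the torch.export tutorial, with bounds matching the [5000, 300000] range that shows up in log (1) below; it did not give me a working variable-length export either:

length = torch.export.Dim("length", min=5000, max=300000)
dynamic_shapes = {"x": {2: length}}
ex = torch.export.export(model, sample_args, dynamic_shapes=dynamic_shapes)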
Upon further inspection of the logs, I found the following:
(1) I0405 19:44:30.139000 67963 site-packages/torch/fx/experimental/symbolic_shapes.py:4423] [0/0] create_symbol s0 = 20000 for L['x'].size()[2] [5000, 300000] (_dynamo/variables/builder.py:2861 in ), for more info run with TORCHDYNAMO_EXTENDED_DEBUG_CREATE_SYMBOL="s0" or to suppress this message run with TORCHDYNAMO_EXTENDED_ADVICE="0"
(2) I0405 19:44:30.256000 67963 site-packages/torch/fx/experimental/symbolic_shapes.py:6281] [0/0] eval Eq(s0 - 250, 19750) [guard added] x = F.max_pool1d(x, self.maxpool)  # model.py in forward (nn/functional.py:740 in _max_pool1d)
What I think happens: during tracing, a guard is added at the max-pool because the symbolic placeholder shape cannot be verified there (it is only known at runtime in the variable-length model). The guard in log (2), Eq(s0 - 250, 19750), algebraically forces s0 = 20000, so the supposedly dynamic final dimension is specialized back to the length of the dummy input, which is exactly what the RuntimeError above shows.
I am just not sure of the specific syntax to resolve this error. Can anybody help?