Dynamic Inference with Channel indexing is very slow

Hi guys,
I don’t know whether anyone has encountered this before, but in an attempt to boost inference speed, I’m using channel indexing for weight selection during dynamic inference, following this example:

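To clarify what I mean by channel indexing (a minimal sketch of the idea, not the linked example — names and shapes here are my own assumptions): a sub-network at a reduced width multiplier is carved out of the full weight tensor by indexing the first `k` output and input channels. Note that contiguous slicing gives a view, while fancy indexing with an index array materializes a fresh copy on every forward pass, which I suspect may matter on mobile:

```python
import numpy as np

# Hypothetical conv weight of a full-width layer: (out_ch, in_ch, kH, kW).
full_weight = np.random.randn(64, 64, 3, 3).astype(np.float32)

width_mult = 0.25
out_c = int(64 * width_mult)  # 16
in_c = int(64 * width_mult)   # 16

# Contiguous slicing: returns a view, essentially free.
w_slice = full_weight[:out_c, :in_c]

# Fancy indexing with an index array: allocates and copies a new
# tensor every time it runs (this is what "channel indexing" does).
idx_out = np.arange(out_c)
idx_in = np.arange(in_c)
w_indexed = full_weight[idx_out][:, idx_in]

assert w_slice.shape == (16, 16, 3, 3)
assert w_indexed.shape == (16, 16, 3, 3)
```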
However, the inference speed of a ResNet-18 at only 0.25x width, measured on a mobile device, is worse than that of the full vanilla model.
Any thoughts?