Torch.jit.script is failing due to Sigmoid?

I’m following (Beta) Convert MobileNetV2 to NNAPI — PyTorch Tutorials 2.0.1+cu117 documentation to get my model to work on an Android GPU device, but I’m using

        traced = torch.jit.script(model, input_tensor)

instead of torch.jit.trace, since scripting is supposed to be more suitable for my model architecture (YoloX).

The scripting is failing with the following error:

Arguments for call are not valid.
The following variants are available:
  
  aten::sigmoid(Tensor self) -> (Tensor):
  Expected a value of type 'Tensor' for argument 'self' but instead found type '__torch__.yolox.models.network_blocks.SiLU (of Python compilation unit at: 0x3516fd0)'.
  
  aten::sigmoid.out(Tensor self, *, Tensor(a!) out) -> (Tensor(a!)):
  Expected a value of type 'Tensor' for argument 'self' but instead found type '__torch__.yolox.models.network_blocks.SiLU (of Python compilation unit at: 0x3516fd0)'.

The original call is:
  File "/home/josephj/srcs/cv_training_code/pytorch/yolox/yolox/models/network_blocks.py", line 14
    @staticmethod
    def forward(x):
        sig = torch.sigmoid(x)
              ~~~~~~~~~~~~~ <--- HERE
        res = x * sig
        return res

This is a custom SiLU implementation that replaces the PyTorch SiLU, as the built-in one creates issues when exporting to ONNX/TFLite or when trying to run

    nnapi_model = torch.backends._nnapi.prepare.convert_model_to_nnapi(traced, input_tensor)

The full SiLU implementation is

class SiLU(nn.Module):
    """export-friendly version of nn.SiLU()"""
    @staticmethod
    def forward(x):
        sig = torch.sigmoid(x)
        res = x * sig
        return res

Any ideas how to correct this issue?

Don’t make forward static (and add self as the first argument) and you should be OK. Because forward is a @staticmethod, the module instance itself gets bound to the first parameter x, so torch.sigmoid receives a SiLU module instead of a Tensor — which is exactly what the error message is complaining about.
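
A minimal sketch of the fix, keeping the body of the original implementation unchanged apart from dropping @staticmethod and adding self:

```python
import torch
import torch.nn as nn


class SiLU(nn.Module):
    """Export-friendly version of nn.SiLU() that torch.jit.script can compile."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Regular (non-static) method: `self` is bound to the module instance,
        # so `x` is now the input Tensor, as torch.sigmoid expects.
        sig = torch.sigmoid(x)
        res = x * sig
        return res


# Scripting the module alone now succeeds:
scripted = torch.jit.script(SiLU())
```

The scripted module is numerically identical to nn.SiLU, so swapping it in should not change model outputs.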