Error about custom build when running a quantized model on Android

My quantized model “” runs well on Android without a custom build.
I used torch.jit.export_opnames() to generate SELECTED_OP_LIST, but the list contains no ReLU op such as
“- aten::relu”. Trying ReLU6 instead hits the same problem.
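One workaround is to append the missing ops to the list by hand before generating the YAML for the custom build. This is a minimal sketch, assuming the usual SELECTED_OP_LIST format of one `- op` line per operator; the `op_names` list here is a hypothetical stand-in for what `torch.jit.export_opnames()` returns for your model. Since the quantized ReLU module dispatches to `torch.relu_` when `inplace=True` and `torch.relu` otherwise, adding both variants covers either path:

```python
# Hypothetical op list, standing in for torch.jit.export_opnames(scripted_model).
op_names = [
    "aten::quantize_per_tensor",
    "aten::dequantize",
    "quantized::conv2d",
]

# Quantized ReLU calls torch.relu_ (inplace=True) or torch.relu (inplace=False),
# so include both operator variants to be safe.
for relu_op in ("aten::relu", "aten::relu_"):
    if relu_op not in op_names:
        op_names.append(relu_op)

# SELECTED_OP_LIST is consumed as a YAML file of "- op" lines.
yaml_text = "\n".join(f"- {op}" for op in sorted(op_names))
print(yaml_text)
```

You would then write `yaml_text` to the file passed as SELECTED_OP_LIST when building the Android libraries.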

torch version: 1.5.0
Here is the RuntimeError:

java.lang.RuntimeException: The following operation failed in the TorchScript interpreter.
    Traceback of TorchScript, serialized code (most recent call last):
      File "code/__torch__/torch/nn/quantized/", line 9, in forward
      if inplace:
        _1 = torch.relu_(input)
             ~~~~~~~~~~~ <--- HERE
        _1 = torch.relu(input)
    Traceback of TorchScript, original code (most recent call last):
      File "###/python3.7/site-packages/torch/nn/quantized/", line 322, in forward
            raise ValueError("Input to 'quantized.relu' must be quantized!")
        if inplace:
            return torch.relu_(input)
                   ~~~~~~~~~~~ <--- HERE
            return torch.relu(input)
    RuntimeError: Operator has been stripped in the custom build.

Can someone help? Thank you!

Update: I changed the ReLU to inplace=False and it works now.
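To make that fix easy to apply across a whole model, here is a small sketch; `disable_inplace_relu` is a hypothetical helper, not a torch API. It flips `inplace` off on every ReLU/ReLU6 module before scripting, so the serialized graph should reference `aten::relu` rather than the stripped `aten::relu_`:

```python
import torch.nn as nn

def disable_inplace_relu(model: nn.Module) -> nn.Module:
    """Set inplace=False on every ReLU/ReLU6 submodule (hypothetical helper)."""
    for module in model.modules():
        if isinstance(module, (nn.ReLU, nn.ReLU6)):
            module.inplace = False
    return model

# Example: a model built with inplace activations.
model = nn.Sequential(nn.Linear(4, 4), nn.ReLU(inplace=True))
disable_inplace_relu(model)
print(model[1].inplace)  # False
```

Run this before quantization/scripting so `torch.jit.export_opnames()` reflects the non-inplace variant.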