Error about custom build when running quantized model on Android

My quantized model “quanted.pt” runs well on Android without a custom build.
I used torch.jit.export_opnames() to get the SELECTED_OP_LIST, but there is no ReLU op such as
“- aten::relu” in the output. Trying ReLU6 instead hits the same problem.
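
Roughly how I generate the op list for the custom build (a minimal sketch; the file names are placeholders, and dumping to YAML assumes PyYAML is installed):

    import torch
    import yaml

    # load the already scripted/quantized model
    model = torch.jit.load("quanted.pt")

    # collect the root operators the model actually calls
    op_names = torch.jit.export_opnames(model)
    print(op_names)  # neither "aten::relu" nor "aten::relu_" shows up here

    # write them in the "- aten::xxx" YAML format used for SELECTED_OP_LIST
    with open("quanted_ops.yaml", "w") as f:
        yaml.dump(op_names, f)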

torch version: 1.5.0
Here is the RuntimeError:

java.lang.RuntimeException: The following operation failed in the TorchScript interpreter.
    Traceback of TorchScript, serialized code (most recent call last):
      File "code/__torch__/torch/nn/quantized/functional.py", line 9, in forward
        pass
      if inplace:
        _1 = torch.relu_(input)
             ~~~~~~~~~~~ <--- HERE
      else:
        _1 = torch.relu(input)
   
    Traceback of TorchScript, original code (most recent call last):
      File "###/python3.7/site-packages/torch/nn/quantized/functional.py", line 322, in forward
            raise ValueError("Input to 'quantized.relu' must be quantized!")
        if inplace:
            return torch.relu_(input)
                   ~~~~~~~~~~~ <--- HERE
        else:
            return torch.relu(input)
    RuntimeError: Operator has been stripped in the custom build.

Can someone help? Thank you!

Update: I changed the ReLU to inplace=False and it works now.
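
For anyone who hits the same error: with inplace=True, the scripted quantized ReLU takes the torch.relu_ branch (aten::relu_), which had been stripped from my custom build; with inplace=False the model calls torch.relu instead, and that path works. A minimal sketch of the change (this module is illustrative, not my real network):

    import torch
    import torch.nn as nn

    class TinyNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.quant = torch.quantization.QuantStub()
            self.conv = nn.Conv2d(3, 3, 1)
            # inplace=False -> the scripted code calls torch.relu (aten::relu);
            # inplace=True would need aten::relu_, which my custom build stripped
            self.relu = nn.ReLU(inplace=False)
            self.dequant = torch.quantization.DeQuantStub()

        def forward(self, x):
            x = self.quant(x)
            x = self.relu(self.conv(x))
            return self.dequant(x)

Adding “- aten::relu_” to the YAML by hand might be another way to keep the inplace version, though I have not tried it.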