ONNX model inference produces different results for the identical input

A PyTorch model is converted to ONNX and then loaded in onnxruntime.

For the identical input, the ONNX model produces different results (the PyTorch model works as expected).

Please refer to this reproducible notebook.
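
Since the notebook itself is not reproduced inline, here is a minimal sketch of the kind of check involved. The `ToyNet` model, its input shape, and the file name are assumptions, not the actual model from the notebook:

```python
# Sketch: export a toy PyTorch model with PReLU to ONNX, then compare
# onnxruntime output against PyTorch for the same input (ToyNet is hypothetical).
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

class ToyNet(nn.Module):  # hypothetical stand-in for the real model
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 16)
        self.act = nn.PReLU(16)

    def forward(self, x):
        return self.act(self.fc(x))

model = ToyNet().eval()
x = torch.randn(1, 16)

# Export to ONNX and load with onnxruntime.
torch.onnx.export(model, x, "toy.onnx", input_names=["x"], output_names=["y"])
sess = ort.InferenceSession("toy.onnx", providers=["CPUExecutionProvider"])

with torch.no_grad():
    ref = model(x).numpy()

# Run the same input twice through the ONNX model and compare against PyTorch.
out1 = sess.run(None, {"x": x.numpy()})[0]
out2 = sess.run(None, {"x": x.numpy()})[0]
print("ONNX run-to-run max diff:", np.abs(out1 - out2).max())
print("PyTorch vs ONNX max diff:", np.abs(ref - out1).max())
```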

I tracked the issue down to the use of PReLU; I'm not sure about the fix yet.
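
To confirm PReLU is the culprit, one way is to export a single `nn.PReLU` layer on its own and compare outputs. This is a hedged sketch under assumed shapes, not the exact repro from the notebook:

```python
# Sketch: isolate nn.PReLU, export it to ONNX, and compare with PyTorch.
# If the mismatch already appears here, the PReLU export/handling is the cause.
import numpy as np
import torch
import torch.nn as nn
import onnxruntime as ort

prelu = nn.PReLU(num_parameters=8).eval()
x = torch.randn(1, 8, 4, 4)  # assumed per-channel PReLU on a 4D tensor

torch.onnx.export(prelu, x, "prelu.onnx", input_names=["x"], output_names=["y"])
sess = ort.InferenceSession("prelu.onnx", providers=["CPUExecutionProvider"])

with torch.no_grad():
    ref = prelu(x).numpy()
ort_out = sess.run(None, {"x": x.numpy()})[0]
print("PReLU PyTorch vs ONNX max diff:", np.abs(ref - ort_out).max())
```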