ONNX model inference produces different results for the identical input

A PyTorch model is converted to ONNX and then loaded in onnxruntime.

For the identical input, the ONNX model produces different results, while the original PyTorch model works correctly.
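For reference, this is roughly how I compare the two runtimes. It is a minimal sketch with a hypothetical stand-in model; the actual network, input shape, file name, and opset in the notebook are different.

```python
import numpy as np
import torch
import onnxruntime as ort

# Hypothetical stand-in for the real model; replace with the actual network.
model = torch.nn.Sequential(
    torch.nn.Conv2d(3, 8, kernel_size=3, padding=1),
    torch.nn.ReLU(),
).eval()

x = torch.randn(1, 3, 32, 32)

# Convert the PyTorch model to ONNX.
torch.onnx.export(model, x, "model.onnx", opset_version=11,
                  input_names=["input"], output_names=["output"])

# Run the same input through PyTorch ...
with torch.no_grad():
    torch_out = model(x).numpy()

# ... and through onnxruntime, twice, to also check run-to-run stability.
sess = ort.InferenceSession("model.onnx")
onnx_out_1 = sess.run(None, {"input": x.numpy()})[0]
onnx_out_2 = sess.run(None, {"input": x.numpy()})[0]

print("max |PyTorch - ONNX|  :", np.abs(torch_out - onnx_out_1).max())
print("max |ONNX run 1 - 2|  :", np.abs(onnx_out_1 - onnx_out_2).max())
```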

Please refer to this reproducible notebook.

I worked out that the issue is the use of PReLU, but I am not sure about the fix yet.
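To narrow it down, I exported a model that contains nothing but a per-channel PReLU and compared it the same way. A sketch of that isolation test (the channel count, input shape, and opset are arbitrary choices on my side):

```python
import numpy as np
import torch
import onnxruntime as ort

# A model that is only a per-channel PReLU, with non-uniform slopes so that
# any broadcasting problem in the exported graph actually becomes visible.
prelu = torch.nn.PReLU(num_parameters=8).eval()
with torch.no_grad():
    prelu.weight.copy_(torch.linspace(0.05, 0.4, 8))

x = torch.randn(1, 8, 16, 16)

torch.onnx.export(prelu, x, "prelu_only.onnx", opset_version=11,
                  input_names=["input"], output_names=["output"])

with torch.no_grad():
    torch_out = prelu(x).numpy()

sess = ort.InferenceSession("prelu_only.onnx")
onnx_out = sess.run(None, {"input": x.numpy()})[0]

print("max abs diff:", np.abs(torch_out - onnx_out).max())
```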

Could you please elaborate a bit on what exactly the problem with PReLU is?

I have the same problem and my model contains PReLU as well.
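In case it helps while the proper fix is unclear: one workaround that might be worth trying (an assumption on my side, not a confirmed fix) is to swap out `nn.PReLU` before export for a mathematically equivalent expression built from ReLU, so the exported graph avoids the `PRelu` op entirely. Since `PReLU(x) = relu(x) - a * relu(-x)`, something like the following sketch for NCHW inputs:

```python
import torch

class PReLUAsRelu(torch.nn.Module):
    """Hypothetical drop-in replacement for nn.PReLU on NCHW tensors.

    Uses the identity PReLU(x) = relu(x) - a * relu(-x).
    """

    def __init__(self, prelu: torch.nn.PReLU):
        super().__init__()
        # Copy the learned per-channel slopes from the original module.
        self.weight = torch.nn.Parameter(prelu.weight.detach().clone())

    def forward(self, x):
        a = self.weight.view(1, -1, 1, 1)  # broadcast over the channel dim
        return torch.relu(x) - a * torch.relu(-x)

def replace_prelu(module: torch.nn.Module):
    """Recursively replace every nn.PReLU before calling torch.onnx.export."""
    for name, child in module.named_children():
        if isinstance(child, torch.nn.PReLU):
            setattr(module, name, PReLUAsRelu(child))
        else:
            replace_prelu(child)
```

Whether this actually removes the mismatch in your case I cannot say; it just takes the `PRelu` node out of the equation.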