ONNX export gives different results than PyTorch

I’m using the k-diffusion transformer model, and I’ve found that the ONNX export gives significantly different results from the torch model. There were some export warnings to begin with, but I’ve since fixed those. The export seems to work correctly and `onnx.checker.check_model` doesn’t raise an error.

Here is part of the PyTorch output (right) and the ONNX output (left):

Clearly there is some similarity, but the difference is larger than I would have expected. Both TensorRT and ONNX Runtime give the same result, which differs from PyTorch.