ONNX Export Difference

I have exported my model to ONNX and am verifying the exported model with torch.linalg.norm(torch_out - onnx_out). The outputs are computed from torch.ones input tensors. I get the difference norms below:

center diff: 0.0001215097145177424, tensor shape: torch.Size([1, 2, 468, 468])
center_z diff: 0.000555071048438549, tensor shape: torch.Size([1, 1, 468, 468])
dim diff: 0.0008262245682999492, tensor shape: torch.Size([1, 3, 468, 468])
rot diff: 0.0005295916926115751, tensor shape: torch.Size([1, 2, 468, 468])
hm diff: 0.0025102065410465, tensor shape: torch.Size([1, 3, 468, 468])
VFE diff: 0.12264608591794968, tensor shape: torch.Size([10000, 64])

Torch export and inference were performed with the model in eval() mode and under torch.no_grad().
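For reference, this is roughly how the comparison was run; a minimal sketch, assuming ONNX Runtime on the ONNX side and that the model returns the output heads in the order listed above (input shape, file name, and head names are placeholders, not taken from the actual exporter):

```python
import onnxruntime as ort
import torch

def compare_outputs(model: torch.nn.Module, onnx_path: str, dummy: torch.Tensor,
                    names=("center", "center_z", "dim", "rot", "hm")):
    # Run the same all-ones input through PyTorch (eval + no_grad) and ONNX Runtime,
    # then report the norm of the difference for each output head.
    model.eval()
    with torch.no_grad():
        torch_outs = model(dummy)

    sess = ort.InferenceSession(onnx_path)
    onnx_outs = sess.run(None, {sess.get_inputs()[0].name: dummy.numpy()})

    for name, t_out, o_out in zip(names, torch_outs, onnx_outs):
        diff = torch.linalg.norm(t_out - torch.from_numpy(o_out))
        print(f"{name} diff: {diff.item()}, tensor shape: {t_out.shape}")

# e.g. compare_outputs(centerpoint_head, "centerpoint.onnx", torch.ones(1, 4, 468, 468))
```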

Are these differences expected and reasonable, or might something be wrong?

Thanks!

@Orcun_Deniz can you provide the model source code? This will help us understand where the mismatch might be getting triggered.

I uploaded the project to GitHub; https://github.com/OrcunCanDeniz/OpenPCDet_Exports/blob/main/centerpoint_exporter.py does the export. You can comment out this line to avoid loading the weights; this does not change the behaviour.
I broke the model down into two parts and took two exports. One of them seems to work OK, but the one in this block is the problematic one, and this file is the source of that module. Finally, that's my ONNX model.
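For anyone following along without opening the repo, the two exports follow the standard torch.onnx.export pattern; the sketch below is illustrative only (module names, dummy shapes, input/output names, and opset version are placeholders, not copied from centerpoint_exporter.py):

```python
import torch

def export_part(module: torch.nn.Module, dummy_input: torch.Tensor, path: str):
    # Export one of the two sub-modules to its own ONNX file.
    module.eval()
    torch.onnx.export(
        module,
        dummy_input,
        path,
        input_names=["input"],
        output_names=["output"],
        opset_version=11,
        do_constant_folding=True,
    )

# e.g. export_part(vfe_module, torch.ones(10000, 4), "vfe.onnx")
#      export_part(head_module, torch.ones(1, 4, 468, 468), "head.onnx")
```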

Note: I am also exporting a TorchScript version of the model and do not face this issue there.
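The TorchScript check follows the same protocol; a minimal sketch, assuming torch.jit.trace is used and the heads come back in the order listed above (all names here are illustrative):

```python
import torch

def compare_torchscript(module: torch.nn.Module, dummy: torch.Tensor,
                        names=("center", "center_z", "dim", "rot", "hm")):
    # Trace the module and compare its outputs against eager mode on the same input.
    module.eval()
    with torch.no_grad():
        traced = torch.jit.trace(module, dummy)
        eager_outs = module(dummy)
        traced_outs = traced(dummy)
    for name, a, b in zip(names, eager_outs, traced_outs):
        print(f"{name} diff: {torch.linalg.norm(a - b).item()}")

# e.g. compare_torchscript(head_module, torch.ones(1, 4, 468, 468))
```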