Exported ONNX network is not equivalent to the original

I have trained a binary-weight network and would like to export it to ONNX format. However, some convolution layers are changed in the exported ONNX model: the exporter folds each convolution together with the batch-norm layer that follows it, producing non-binary weights and biases that do not exist in the trained network.
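For context, the usual Conv+BatchNorm folding rescales the conv weights by gamma / sqrt(var + eps), which is why the folded weights stop being binary. A minimal plain-Python sketch of that algebra (the BN statistics here are made-up illustrative values, not from my model):

```python
import math

def fold_bn_into_conv(w, b, gamma, beta, mean, var, eps=1e-5):
    """Fold per-channel batch-norm parameters into a conv weight/bias.

    gamma * (conv(x; w, b) - mean) / sqrt(var + eps) + beta
    is equivalent to a conv with scaled weight and shifted bias.
    """
    scale = gamma / math.sqrt(var + eps)
    return w * scale, (b - mean) * scale + beta

# A binary weight (+1) with arbitrary, assumed BN statistics:
w_folded, b_folded = fold_bn_into_conv(w=1.0, b=0.0,
                                       gamma=0.5, beta=0.1,
                                       mean=0.2, var=0.9)
print(w_folded, b_folded)  # folded weight is no longer +/-1
```

This matches what I see in the exported graph: the conv weights are scaled floats and a bias appears even though the trained layer has none.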
The export command I use is:

x = torch.randn(1, 3, 32, 32, requires_grad=True, device='cuda')
torch.onnx.export(net, x, "CNN_Medium.onnx", export_params=True, verbose=True,
                  input_names=['input'], output_names=['output'])

(I dropped the deprecated Variable wrapper; torch.randn already accepts requires_grad and device, and the curly quotes in my original snippet were a paste artifact.)
Has anybody run into similar issues? Is there a way to export the model without this fusion?