Quantization parameter sharing / ONNX node attributes

Hi everyone.

Before ONNX, I used Caffe prototxt files to share models with colleagues. A human-readable format like prototxt made it convenient to add custom attributes to any node of the graph. In particular, we used the Ristretto conventions to add quantization parameters to the prototxt.

These quantization parameters (to convert my model from floating-point to a fixed-point representation on 8 or 16 bits, for instance) are computed by myself after analyzing the model. They, along with the floating-point parameters, are then used by the hardware guys in my company to run the network on INT-only hardware.

ONNX does not define any standard way to express quantization parameters (or I couldn't find one by googling).
Hence my question: how can I share a PyTorch network (which has floating-point parameters) together with the quantization parameters (one set of parameters for every layer)?

The options I see are:

  • Sharing an ONNX model. Then I have to find a way to add arbitrary attributes to ONNX nodes. That would be the neatest option because it would allow model exchange between different frameworks/languages.

  • Adding the quantization parameters as a dictionary to my Python layers, then saving and sharing the '.pth' file exported by torch.save. This has the big drawback of being more dependent on Python and not being standard: the person I send the file to either needs to run a forward pass to get the computation graph, or I need to send them the definition of the Network class.

  • Sharing the ONNX model without quantization params, and putting the latter in a separate text file. However, I then need to establish a correspondence between ops in my ONNX model and entries in the text file (so that the quantization parameter file can be generated automatically too), and I don't know how to do that.

So here I am.

Do you know the easiest way to do this?
Is there an option I did not consider, or which of these options should I investigate further?

Thank you !