escorciav (Victor Escorcia) · 1
Hi!
PyTorch's ONNX export (opset 9) unfolds L2 normalization into multiple operators instead of using ONNX's LpNormalization operator. Is there any clean workaround?
I found a note saying that the "Caffe2 & ONNX implementations differ", but I don't know how that relates to PyTorch.
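For reference, a minimal sketch that reproduces the export (the module, shapes, and file name are just placeholders):

```python
import torch
import torch.nn.functional as F
import onnx


class L2Norm(torch.nn.Module):
    def forward(self, x):
        # L2-normalize along the channel dimension.
        return F.normalize(x, p=2, dim=1)


model = L2Norm().eval()
dummy = torch.randn(1, 128)
torch.onnx.export(model, dummy, "l2norm.onnx", opset_version=9)

# Inspect the exported graph: the normalization shows up as a chain of
# primitive ops rather than a single LpNormalization node.
m = onnx.load("l2norm.onnx")
print([node.op_type for node in m.graph.node])
```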
Thanks in advance,
Victor
escorciav (Victor Escorcia) · 2
I just found out about onnxconverter-common.
I also opened an issue in onnx-optimizer.
escorciav (Victor Escorcia) · 3
We ended up traversing the ONNX graph in Python and replacing the ops directly. Netron is very handy for inspecting the graph while doing so.
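For anyone landing here later, a rough sketch of that kind of graph surgery with the onnx Python API. The assumptions here (check your own graph in Netron first): the exporter lowered the normalization into a ReduceL2 / Clip / Expand / Div chain, normalization is over axis 1, and the file names are placeholders.

```python
import onnx
from onnx import helper

model = onnx.load("model.onnx")  # placeholder path
graph = model.graph

# Tensors that feed a ReduceL2 node, i.e. candidates for the start of a
# decomposed L2-normalization chain.
l2_inputs = {n.input[0] for n in graph.node if n.op_type == "ReduceL2"}

for i, node in enumerate(graph.node):
    # The Div at the end of the chain divides the original tensor by its norm.
    if node.op_type == "Div" and node.input[0] in l2_inputs:
        lp = helper.make_node(
            "LpNormalization",
            inputs=[node.input[0]],    # the original, un-normalized tensor
            outputs=[node.output[0]],  # keep downstream consumers wired up
            axis=1,                    # assumption: normalize over dim 1
            p=2,
        )
        graph.node.insert(i, lp)  # input is defined earlier, so order stays topological
        graph.node.remove(node)   # drop the Div; the rest of the chain is now dead
        break

# The leftover ReduceL2/Clip/Expand nodes are dead ends now; prune them in
# another pass over graph.node if your backend complains about unused nodes.
onnx.checker.check_model(model)
onnx.save(model, "model_lpnorm.onnx")
```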