# What's prim::Constant() mean in onnx model transformed from pytorch?

I transformed a PyTorch model to ONNX:

```
%79 : Tensor = onnx::Unsqueeze[axes=[0]](%75)
%80 : Tensor = onnx::Unsqueeze[axes=[0]](%77)
%81 : Tensor = onnx::Concat[axis=0](%78, %79, %80)
%82 : Float(2, 1, 256) = onnx::ConstantOfShape[value={0}](%81), scope: CRNN/Sequential[rnn]/BidirectionalLSTM[0]/LSTM[rnn]
%83 : Tensor? = prim::Constant(), scope: CRNN/Sequential[rnn]/BidirectionalLSTM[0]/LSTM[rnn]
```

Does `%83 : Tensor? = prim::Constant()` belong to ONNX? I did not find that op in the ONNX operator set.


`prim::Constant()` gives you a "typed None" here, so in PyTorch terms this is an `Optional[Tensor]` (aka `Tensor?`) that is `None` / not present.
One little-known difference between Python and TorchScript – at least I didn't know it until I submitted a PR that needed to be corrected to do the right thing – is that while `None` has its own type in Python, it is always typed (as some optional type) in TorchScript.

(For those pedantic about internals, there is a `NoneType` defined in `jit_types.h`, but it's not for use "inside" TorchScript, and a `None` passed in to a scripted function is converted to the corresponding typed `None`.)

Best regards

Thomas

Thanks, but I can't convert the ONNX model to Caffe2. How do I fix it?


Hi, I'm also curious about `prim::Constant()` in ONNX. How did you solve this problem?

Hi, did you solve this problem?