What does prim::Constant() mean in an ONNX model exported from PyTorch?

I exported the PyTorch model to ONNX and got this graph:

  %79 : Tensor = onnx::Unsqueeze[axes=[0]](%75)
  %80 : Tensor = onnx::Unsqueeze[axes=[0]](%77)
  %81 : Tensor = onnx::Concat[axis=0](%78, %79, %80)
  %82 : Float(2, 1, 256) = onnx::ConstantOfShape[value={0}](%81), scope: CRNN/Sequential[rnn]/BidirectionalLSTM[0]/LSTM[rnn]
  %83 : Tensor? = prim::Constant(), scope: CRNN/Sequential[rnn]/BidirectionalLSTM[0]/LSTM[rnn]

Does %83 : prim::Constant() belong to ONNX? I could not find that op in the ONNX operator set.
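
For context, here is a rough sketch (module, input shapes, and output target are assumptions, chosen to match the 2 x 1 x 256 LSTM state visible above) of the kind of export call that prints a graph dump like this:

  import io
  import torch

  class TinyBiLSTM(torch.nn.Module):
      def __init__(self):
          super().__init__()
          # bidirectional LSTM with hidden_size=256 -> initial state of shape (2, 1, 256)
          self.rnn = torch.nn.LSTM(input_size=512, hidden_size=256, bidirectional=True)

      def forward(self, x):
          out, _ = self.rnn(x)  # no initial hidden state is passed (it defaults to None)
          return out

  model = TinyBiLSTM()
  x = torch.randn(26, 1, 512)  # (seq_len, batch, features) - assumed shapes
  buf = io.BytesIO()
  torch.onnx.export(model, x, buf, verbose=True)  # verbose=True prints the graph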


prim::Constant() gives you a “typed None” here, so in PyTorch terms this is an Optional[Tensor] (aka Tensor?) that is None / not present.
One little-known difference between Python and TorchScript – at least I didn’t know it until I submitted a PR that had to be corrected to do the right thing – is that while None has its own type in Python, in TorchScript it is always typed as some optional type.

(For those pedantic about internals: there is a NoneType defined in jit_types.h, but it is not for use “inside” TorchScript, and a None passed into a scripted function is converted to the corresponding typed None.)
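
To see the same typing in isolation, a minimal sketch (function names and logic are made up for illustration) of an Optional[Tensor] argument in TorchScript:

  import torch
  from typing import Optional

  @torch.jit.script
  def add_bias(x: torch.Tensor, bias: Optional[torch.Tensor]) -> torch.Tensor:
      if bias is None:   # None is checked against the Optional[Tensor] ("Tensor?") type
          return x
      return x + bias

  @torch.jit.script
  def caller(x: torch.Tensor) -> torch.Tensor:
      # the None literal becomes a prim::Constant() node in the graph
      # (its printed type varies with the PyTorch version)
      return add_bias(x, None)

  print(caller.graph)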

Best regards

Thomas

Thanks, but I can’t convert the ONNX model to Caffe2. How do I fix that?
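
For reference, “converting to Caffe2” here usually means loading the ONNX model with the Caffe2 ONNX backend; a minimal sketch (the file name is a placeholder):

  import onnx
  import caffe2.python.onnx.backend as backend

  model = onnx.load("crnn.onnx")         # placeholder path to the exported model
  onnx.checker.check_model(model)        # validate the ONNX model first
  rep = backend.prepare(model, device="CPU")  # build a Caffe2 representation of the model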


Hi, I’m also curious about prim::Constant() in ONNX. How did you solve this problem?

Hi, did you solve this problem?