int? and Tensor? types? Operator implementations

I am printing out the graph for a model and am seeing int? and Tensor? types. Here is the graph.

graph(%self : ClassType<Conv2D2>,
      %input.1 : Float(1, 3, 224, 224)):
  %1 : ClassType<Conv2d> = prim::GetAttr[name="conv"](%self)
  %weight : Tensor = prim::GetAttr[name="weight"](%1)
  %5 : Tensor? = prim::Constant(), scope: Conv2D2/Conv2d[conv]
  %6 : int = prim::Constant[value=1](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %7 : int = prim::Constant[value=1](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %8 : int[] = prim::ListConstruct(%6, %7), scope: Conv2D2/Conv2d[conv]
  %9 : int = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %10 : int = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %11 : int[] = prim::ListConstruct(%9, %10), scope: Conv2D2/Conv2d[conv]
  %12 : int = prim::Constant[value=1](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %13 : int = prim::Constant[value=1](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %14 : int[] = prim::ListConstruct(%12, %13), scope: Conv2D2/Conv2d[conv]
  %15 : bool = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %16 : int = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %17 : int = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %18 : int[] = prim::ListConstruct(%16, %17), scope: Conv2D2/Conv2d[conv]
  %19 : int = prim::Constant[value=1](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %20 : bool = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %21 : bool = prim::Constant[value=0](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %22 : bool = prim::Constant[value=1](), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %input : Float(1, 64, 218, 218) = aten::_convolution(%input.1, %weight, %5, %8, %11, %14, %15, %18, %19, %20, %21, %22), scope: Conv2D2/Conv2d[conv] # /usr/local/lib/python3.6/dist-packages/torch/nn/modules/conv.py:340:0
  %24 : int = prim::Constant[value=1](), scope: Conv2D2/Softmax[softmax] # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1230:0
  %25 : int? = prim::Constant(), scope: Conv2D2/Softmax[softmax]
  %26 : Float(1, 64, 218, 218) = aten::softmax(%input, %24, %25), scope: Conv2D2/Softmax[softmax] # /usr/local/lib/python3.6/dist-packages/torch/nn/functional.py:1230:0
  return (%26)

What are these empty constant operators with odd typings supposed to represent? Where can I find source code or documentation for them? Also, an offshoot question: where can I find the actual implementations of the operators (https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/core/interned_strings.h)?

For sad legacy reasons we have two ways of presenting types: one that matches Python's typing module and our own internal one (which is what you're seeing), even though the underlying types are the same. The ? means optional, so int? is equivalent to Optional[int], and an empty prim::Constant() represents None.
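
For example, here is a minimal sketch (the function and names are illustrative, not from the original post) of how an Optional[int] annotation surfaces as int? when you script a function and print its graph:

import torch
from typing import Optional

@torch.jit.script
def scale(x: torch.Tensor, factor: Optional[int] = None) -> torch.Tensor:
    # factor is listed as `int?` in the printed graph; passing None at a
    # call site shows up as an empty prim::Constant()
    if factor is None:
        return x
    return x * factor

print(scale.graph)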

prim::Constant is a special case handled directly by the TorchScript interpreter. Other operators that don't directly call the underlying torch operators live in register_prim_ops.cpp. Most operators, however, just call into PyTorch tensor ops, so they are generated at build time and placed in torch/csrc/jit/generated.
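
If you just want to see which schemas the JIT has registered for a given operator from Python, a rough sketch like the one below works; note that torch._C._jit_get_all_schemas() is a private, undocumented binding and may change between releases:

import torch

# Dump every registered operator schema whose name mentions "softmax".
# _jit_get_all_schemas() is an internal API, not a stable interface.
for schema in torch._C._jit_get_all_schemas():
    if "softmax" in schema.name:
        print(schema)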


Ah, makes sense now. Thanks!