Failed to export an ONNX attribute 'onnx::Gather', since it's not constant, please try to make things (e.g., kernel size) static if possible

Trying to convert this PyTorch model to ONNX gives me this error. I searched GitHub, and this error came up before in version 1.1.0 but was apparently fixed. Now I'm on torch 1.4.0 (Python 3.6.9) and I still see this error.

  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/__init__.py", line 148, in export
    strip_doc_string, dynamic_axes, keep_initializers_as_inputs)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 66, in export
    dynamic_axes=dynamic_axes, keep_initializers_as_inputs=keep_initializers_as_inputs)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 416, in _export
    fixed_batch_size=fixed_batch_size)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 296, in _model_to_graph
    fixed_batch_size=fixed_batch_size, params_dict=params_dict)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 135, in _optimize_graph
    graph = torch._C._jit_pass_onnx(graph, operator_export_type)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/__init__.py", line 179, in _run_symbolic_function
    return utils._run_symbolic_function(*args, **kwargs)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/utils.py", line 657, in _run_symbolic_function
    return op_fn(g, *inputs, **attrs)
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/symbolic_helper.py", line 128, in wrapper
    args = [_parse_arg(arg, arg_desc) for arg, arg_desc in zip(args, arg_descriptors)]
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/symbolic_helper.py", line 128, in <listcomp>
    args = [_parse_arg(arg, arg_desc) for arg, arg_desc in zip(args, arg_descriptors)]
  File "/usr/local/lib/python3.6/dist-packages/torch/onnx/symbolic_helper.py", line 81, in _parse_arg
    "', since it's not constant, please try to make "
RuntimeError: Failed to export an ONNX attribute 'onnx::Gather', since it's not constant, please try to make things (e.g., kernel size) static if possible

How can I fix it? I've also tried the latest nightly build, and the same error comes up.

My code:

from model import BiSeNet
import torch.onnx
import torch

net = BiSeNet(19)
net.cuda()
net.load_state_dict(torch.load('/content/drive/My Drive/Collab/fp/res/cp/79999_iter.pth'))
net.eval()

dummy = torch.rand(1,3,512,512).cuda()
torch.onnx.export(net, dummy, "Model.onnx", input_names=["image"], output_names=["output"])

I also tried downgrading to torch 1.0.0, and the error is still there: "RuntimeError: ONNX symbolic expected a constant value in the trace".

I added print(v.node()) to symbolic_helper.py just before the runtime error is raised to see what's causing the error.

This is the output: %595 : Long() = onnx::Gather[axis=0](%592, %594) # /content/drive/My Drive/Collab/fp/model.py:111:0

And line 111 in model.py is: avg = F.avg_pool2d(feat32, feat32.size()[2:])
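For context, this pattern alone seems to be enough to trigger the failure. A minimal sketch (a hypothetical ToyPool module, not the actual BiSeNet) that isolates what that line does, which on torch 1.4 appears to hit the same onnx::Gather error because the kernel size is read from the tensor at trace time instead of being a constant:

import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyPool(nn.Module):
    def forward(self, feat32):
        # kernel size depends on the incoming feature map's spatial dims,
        # so the traced graph has to look it up at runtime (onnx::Gather)
        return F.avg_pool2d(feat32, feat32.size()[2:])

dummy = torch.rand(1, 3, 16, 16)
torch.onnx.export(ToyPool(), dummy, "toy.onnx")  # raises the 'onnx::Gather' RuntimeError on torch 1.4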

Based on further research, I found this source stating:

Both resNet50 and 32 are fine, but for resNet18 the ONNX model cannot be exported.

The model I’m using is indeed based on resnet18.

The source suggests the following changes:

From this:

import torch.nn.functional as F

def forward(self, x):
    feat = self.base(x)
    feat = F.avg_pool2d(feat, feat.size()[2:])

To this:

class Model(nn.Module):
    def __init__(self):

        self.avg_pool2d = nn.AvgPool2d(kernel_size=k_s, ceil_mode=False)

    def forward(self, x):

        feat = self.avg_pool2d(feat, feat.size()[2:])

However, after this change I get another error:

forward() takes 2 positional arguments but 3 were given
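For what it's worth, that second error is likely because nn.AvgPool2d already knows its kernel size, so its forward() accepts only the input tensor; passing feat.size()[2:] as a second argument is what produces "forward() takes 2 positional arguments but 3 were given". A minimal sketch of what the suggested change presumably intends, with a hypothetical hard-coded kernel size of 16 (512 / 32 for the feat32 branch) and a placeholder backbone:

import torch
import torch.nn as nn

class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.base = nn.Identity()  # placeholder for the real backbone
        # kernel size fixed at construction time, so it is constant in the ONNX trace
        self.avg_pool2d = nn.AvgPool2d(kernel_size=16, ceil_mode=False)

    def forward(self, x):
        feat = self.base(x)
        feat = self.avg_pool2d(feat)  # only the tensor is passed to the module
        return feat

out = Model()(torch.rand(1, 3, 16, 16))
print(out.shape)  # torch.Size([1, 3, 1, 1])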


I got a very similar error while trying to export my resnet model. The only difference is that the problem is 'onnx::Sub' instead of 'onnx::Gather' as in your case. I have not found the solution yet. I am also on PyTorch 1.4.0.
Let me know if you find the solution.


@malemo @Mate_Nagy have you guys figured out a solution?
Any suggestions?
Thanks.

Same problem, but with %147 : Long() = onnx::Div(%145, %146).
I can't trace back where the error occurs, can someone help?
PyTorch 1.3.0

Same as @Mate_Nagy: 'onnx::Sub'. Any ideas?