Error when exporting model to ONNX

The model is available on GitHub; it is for manipulating multiple face attributes.
Here is the link

It generates four files:
Enc-iter.pth
Dec-iter.pth
D1-iter.pth
D2-iter.pth

torch_out = torch.onnx.export(model, x, "model.onnx", verbose=True)

The export function prints the graph, but when I print torch_out it gives None, even though it should print a tensor value.

This causes an AttributeError when comparing torch_out with the Caffe2 model output:

np.testing.assert_almost_equal(torch_out.data.cpu().numpy(), c2_out, decimal=3)

Here I get the AttributeError:
'NoneType' object has no attribute 'data'

I asked this on GitHub/onnx, but they said to ask it on the PyTorch forum, since the issue is in the PyTorch code.

torch.onnx.export isn't supposed to return anything, but to store a binary protobuf file at the specified location.
This file can then be loaded using onnx.

Have a look at the doc for some example code.
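
Roughly, the flow would look like this (a minimal sketch, assuming the model instance model and the dummy input x from your snippet):

import torch
import onnx

# Export writes a binary protobuf file at the given path; it does not return the model output.
torch.onnx.export(model, x, "model.onnx", verbose=True)

# Load the exported file back and check that it is a well-formed ONNX model.
onnx_model = onnx.load("model.onnx")
onnx.checker.check_model(onnx_model)

# If you need the PyTorch output for a numerical comparison, run the model directly.
torch_out = model(x)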

Thank you for the response.

Verify the numerical correctness up to 3 decimal places:

np.testing.assert_almost_equal(torch_out.data.cpu().numpy(), c2_out, decimal=3)

Here we are comparing two multidimensional arrays, as in the small example below.
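
For instance, on two small arrays the check passes only if every element matches up to the requested number of decimals (a minimal sketch with made-up values):

import numpy as np

a = np.array([[1.0001, 2.0002]])
b = np.array([[1.0004, 1.9999]])

# Passes: all elements agree to 3 decimal places.
np.testing.assert_almost_equal(a, b, decimal=3)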

To check this, I ran the super resolution example; here is the output.

import io
import numpy as np

from torch import nn
import torch.utils.model_zoo as model_zoo
import torch.onnx

# SuperResolutionNet is defined as in the super-resolution example
torch_model = SuperResolutionNet(upscale_factor=3)

# Input to the model
batch_size = 1
x = torch.randn(batch_size, 1, 224, 224, requires_grad=True)

# Export the model
torch_out = torch.onnx._export(torch_model,              # model being run
                               x,                         # model input (or a tuple for multiple inputs)
                               "super_resolution.onnx",   # where to save the model (can be a file or file-like object)
                               export_params=True,        # store the trained parameter weights inside the model file
                               verbose=True)
print("torch_out", torch_out)

This prints the graph as output, and printing torch_out shows its value, i.e. the tensor:

torch_out tensor([[[[ 6.8571e-01, 9.3982e-02, 1.8855e-01, …, 1.1247e-01,
3.9000e-01, -3.4897e-01],
[ 8.2722e-02, -6.9161e-01, 1.0651e-01, …, 1.6725e-01,
8.2437e-02, -1.2075e-02],
[ 5.9089e-02, -1.4693e-01, -2.1743e-01, …, -1.1221e-01,
9.0878e-02, 2.8896e-01],
…,
[-9.9215e-02, 6.8443e-02, 3.1096e-01, …, 6.0085e-05,
6.3814e-02, 5.8048e-02],
[ 8.1212e-02, -8.7285e-02, 3.3845e-02, …, -4.4617e-03,
-7.5614e-02, 2.7088e-01],
[-1.1385e-01, 3.1267e-01, -3.3085e-01, …, -2.5969e-01,
5.3383e-01, -2.2986e-01]]]], grad_fn=)

and

import onnx
import onnx_caffe2.backend

# Load the ONNX ModelProto object. model is a standard Python protobuf object.
model = onnx.load("super_resolution.onnx")

# Prepare the caffe2 backend for executing the model. This converts the ONNX model into a
# Caffe2 NetDef that can execute it. Other ONNX backends, like one for CNTK, will be
# available soon.
prepared_backend = onnx_caffe2.backend.prepare(model)

# Run the model in Caffe2.
# Construct a map from input names to Tensor data.
# The graph of the model itself contains inputs for all weight parameters, after the input image.
# Since the weights are already embedded, we just need to pass the input image.
# Set the first input.
W = {model.graph.input[0].name: x.data.numpy()}

# Run the Caffe2 net:
c2_out = prepared_backend.run(W)[0]
print("c2_out", c2_out)

# Verify the numerical correctness up to 3 decimal places.
np.testing.assert_almost_equal(torch_out.data.cpu().numpy(), c2_out, decimal=3)

print("Exported model has been executed on Caffe2 backend, and the result looks good!")

Here the output c2_out is the following array:

c2_out [[[[ 0.19614297 -0.14641185 0.63868225 … -0.12567244 -0.09246139
0.1514894 ]
[ 0.2977414 0.30597484 0.07216635 … 0.3880918 0.31021857
-0.00942096]
[ 0.29691827 0.26158085 0.67230344 … 0.5879603 -0.22218955
-0.04444573]

[-0.40999678 -0.31105435 -0.9089918 … 0.965673 -0.32325625
0.47099867]
[-0.34081718 -0.12496098 -0.7071778 … -0.48262012 -0.28351355
0.305478 ]
[-0.10641275 -0.42513472 -0.41691086 … 0.01121937 0.36214358
0.11300252]]]]

The model I am trying to export does not output a tensor value; when I try to print torch_out, it gives None,
and that is why it raises the AttributeError.

torch.onnx.export does not return anything, so it will yield an error if you try to access the return value.
In your code snippet, you are using torch.onnx._export (note the underscore), which seems to be an internal method and will yield an output.

I'm not sure which method you are currently using.
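
As a rough sketch of the difference (assuming a torch_model and an input x as in the tutorial):

import torch.onnx

# Public API: writes super_resolution.onnx and returns None.
ret = torch.onnx.export(torch_model, x, "super_resolution.onnx", export_params=True)
print(ret)  # None

# Internal method used in the tutorial: also returns the traced model output.
torch_out = torch.onnx._export(torch_model, x, "super_resolution.onnx", export_params=True)

# Alternatively, get the reference output for the comparison from a plain forward pass.
torch_out = torch_model(x)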


Thank you for the response.
I was using torch.onnx.export, and I was supposed to use torch.onnx._export.
Silly mistake.
Thanks a ton!