While trying to export the trained agent network to ONNX format using the built-in tracing-based function `torch.onnx.export`, I am facing the following error.
```python
import os

import torch
import torch.onnx

from ddpg_agent import Agent
from ACVEnv import *  # provides a_env

attacker_agent = Agent(agent_name='attacker', alpha=0.0001, beta=0.001,
                       input_dims=a_env.observation_space('attacker').shape, tau=0.001,
                       batch_size=64, fc1_dims=400, fc2_dims=300,
                       n_actions=a_env.action_space('attacker').shape[0])
detector_agent = Agent(agent_name='detector', alpha=0.0001, beta=0.001,
                       input_dims=a_env.observation_space('detector').shape, tau=0.001,
                       batch_size=64, fc1_dims=400, fc2_dims=300,
                       n_actions=a_env.action_space('detector').shape[0])

# attacker_agent(a_env).load_models()
attacker_agent.load_models()
detector_agent.load_models()

# attacker_agent(a_env).eval()
attacker_agent.eval()
detector_agent.eval()

a_env.reset()
observations = {'attacker': a_env._observation_spaces['attacker'].sample(),
                'detector': a_env._observation_spaces['detector'].sample()}
agent_names = ['attacker', 'detector']
agents = {'attacker': attacker_agent, 'detector': detector_agent}

for agent in agent_names:
    file_path = os.path.join('tmp/onnx', agent + '_exported.onnx')
    print(torch.flatten(torch.tensor(observations[agent])))
    torch.onnx.export(agents[agent], tuple(observations[agent]), file_path, verbose=True)
```
The corresponding GitHub issue is closed, but I found no solid solution in it!
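For comparison, `torch.onnx.export` traces the module, so the `args` it receives must be tensors (or tuples of tensors). Note that calling `tuple()` on a sampled numpy observation, as in the loop above, produces a tuple of raw numpy scalars, which the tracer cannot handle. A minimal sketch of the difference (the names here are illustrative, not from the actual code):

```python
import numpy as np
import torch

# a sampled Box observation is a numpy array
obs = np.array([0.1, 0.2, 0.3], dtype=np.float32)

# tuple(obs) yields a tuple of numpy float32 scalars: rejected by the JIT tracer
bad_args = tuple(obs)

# a single batched tensor is what the tracer expects
good_args = (torch.as_tensor(obs).unsqueeze(0),)
```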
(Issue opened 11 Nov 2019 at 04:51 PM UTC, closed 17 May 2021 at 10:18 PM UTC; labels: oncall: jit, triaged.)
## 🐛 Bug
I am receiving this error on pytorch model export to onnx:
`RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type NoneType`
The error says that tuples are supported but got unsupported type NoneType. I checked the type of the variable `out` and it is a tuple at the error `out_vars, _ = _flatten(out)`
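The failure mode described in the issue can be reproduced with any module whose `forward` returns `None`: the tracer (which the legacy ONNX exporter relies on) cannot flatten a `None` output. A toy sketch, using `torch.jit.trace` directly — the exact error message varies across PyTorch versions:

```python
import torch
import torch.nn as nn

class ReturnsNone(nn.Module):
    """Toy module whose forward discards its result and returns None."""
    def forward(self, x):
        _ = x.sum()
        return None  # tracing/export cannot flatten a None output

# tracing a None-returning forward raises at the output-flattening step
try:
    torch.jit.trace(ReturnsNone(), torch.randn(2, 2))
    failed = False
except Exception:
    failed = True
```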
## Code To Reproduce (download [save_entire_model](https://www.dropbox.com/s/qkeeuafeij1vhdu/save_entire_model?dl=0))
```python
import onnx
import torch

model = torch.load('save_entire_model')
text_input = torch.randint(low=0, high=5000, size=(10, 10))
input_names = ['inputs']
output_names = ['predictions']
dummy_inputs = (text_input, text_input, text_input, text_input)
torch.onnx.export(model, args=dummy_inputs, input_names=input_names,
                  output_names=output_names, f='some_file')
```
Stack trace:
```
Traceback (most recent call last):
File "pytorch_to_tf_serving.py", line 192, in <module>
main(args)
File "pytorch_to_tf_serving.py", line 171, in main
num_inputs=len(dummy_inputs))
File "pytorch_to_tf_serving.py", line 59, in export_onnx
output_names=output_names, f=file)
File "/usr/local/lib/python3.7/site-packages/torch/onnx/__init__.py", line 143, in export
strip_doc_string, dynamic_axes, keep_initializers_as_inputs)
File "/usr/local/lib/python3.7/site-packages/torch/onnx/utils.py", line 66, in export
dynamic_axes=dynamic_axes, keep_initializers_as_inputs=keep_initializers_as_inputs)
File "/usr/local/lib/python3.7/site-packages/torch/onnx/utils.py", line 382, in _export
fixed_batch_size=fixed_batch_size)
File "/usr/local/lib/python3.7/site-packages/torch/onnx/utils.py", line 249, in _model_to_graph
graph, torch_out = _trace_and_get_graph_from_model(model, args, training)
File "/usr/local/lib/python3.7/site-packages/torch/onnx/utils.py", line 206, in _trace_and_get_graph_from_model
trace, torch_out, inputs_states = torch.jit.get_trace_graph(model, args, _force_outplace=True, _return_inputs_states=True)
File "/usr/local/lib/python3.7/site-packages/torch/jit/__init__.py", line 275, in get_trace_graph
return LegacyTracedModule(f, _force_outplace, return_inputs, _return_inputs_states)(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/torch/nn/modules/module.py", line 541, in __call__
result = self.forward(*input, **kwargs)
File "/usr/local/lib/python3.7/site-packages/torch/jit/__init__.py", line 355, in forward
out_vars, _ = _flatten(out)
RuntimeError: Only tuples, lists and Variables supported as JIT inputs/outputs. Dictionaries and strings are also accepted but their usage is not recommended. But got unsupported type NoneType
```
## Environment
PyTorch (tried both versions: 1.3.0 and 1.3.1)
Onnx Version: 1.6.0
Python 3.7
cc @suo
```python
def forward(self, observation):
    self.actor.eval()
    state = T.tensor([observation], dtype=T.float).to(self.actor.device)
    mu = self.actor.forward(state).to(self.actor.device)
    mu_prime = mu + T.tensor(self.noise(),
                             dtype=T.float).to(self.actor.device)
    self.actor.train()
    return mu_prime
```
I am also adding the forward function defined for the DDPG agent, in case that helps.
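One common workaround in cases like this (a sketch, under the assumption that `self.actor` is a plain `nn.Module`) is to export a trimmed wrapper around the actor instead of the `Agent` object itself, so that tracing sees only tensor-in/tensor-out operations: no `eval()`/`train()` toggling, no Python-side list wrapping, and no noise injection.

```python
import torch
import torch.nn as nn

class ExportableActor(nn.Module):
    """Hypothetical wrapper exposing only the deterministic actor path."""
    def __init__(self, actor: nn.Module):
        super().__init__()
        self.actor = actor

    def forward(self, observation: torch.Tensor) -> torch.Tensor:
        # pure tensor computation: safe for torch.jit.trace / torch.onnx.export
        return self.actor(observation)

# usage sketch with a stand-in actor network
actor = nn.Linear(4, 2)
wrapper = ExportableActor(actor).eval()
out = wrapper(torch.randn(1, 4))
```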
Based on the error message it seems your model returns a `numpy.float32`, which is not supported. Convert it to a tensor and this error might be solved.
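That suggestion amounts to wrapping any non-tensor value before returning it from `forward` — a sketch (the `numpy.float32` return value here is hypothetical, mirroring the reply's guess):

```python
import numpy as np
import torch

raw_out = np.float32(0.5)  # hypothetical non-tensor value leaking out of forward()
out = torch.as_tensor(raw_out, dtype=torch.float32)  # tracer-friendly tensor
```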
No, we tried giving the observation inputs in tuple/tensor formats, but a similar message appears. The last error stack trace shows it is thrown from somewhere inside JIT. We have also raised an issue about this here, which contains the complete error log for your reference.