Can I convert a TorchScript module to an nn.Module?

Hi,
I have a problem converting a TorchScript module into an ONNX module, but it works just fine on a “normal” module.

I opened a bug report about that at https://github.com/pytorch/pytorch/issues/30512.

If I could convert the TorchScript module to an nn.Module, that would solve my issue.

Thanks,
Liron

Code example (you can ignore _test_normal):

    def _test_normal(self, num_classes, dummy_input):
        model = torchvision.models.resnet18(num_classes=num_classes)
        # Drop the first three characters of each saved state-dict key
        # (a module-name prefix) so the keys match the fresh model
        model_state_fixed = {}
        for k, v in self._model_state.items():
            k_fixed = k[3:]
            model_state_fixed[k_fixed] = v
        model.load_state_dict(model_state_fixed)
        torch.onnx.export(model, dummy_input, "/app_data/test_torch_script/torch_script_test_normal.onnx")

    def convert(self):
        # Load the saved ScriptModule and run it once to get example outputs
        loaded = torch.jit.load(self._torch_script_path)
        # loaded.load_state_dict(self._model_state)
        dummy_input = torch.randn(1, 3, 224, 224)
        target = loaded(dummy_input)
        self._test_normal(num_classes=len(target[0]), dummy_input=dummy_input)
        torch.onnx.export(loaded, dummy_input, self._out_onnx_path, verbose=True,
                          operator_export_type=torch.onnx.OperatorExportTypes.ONNX,
                          example_outputs=target)

There isn’t a way to extract an nn.Module from a compiled ScriptModule. Is it possible for you to export your original module instead of a ScriptModule?

For some background, torch.onnx.export will use torch.jit.trace to get an exportable graph from an nn.Module. The ONNX exporter does not support all the features of TorchScript (e.g. if you used torch.jit.script to compile your model, it may not be possible to export that compiled module to ONNX), but relying on torch.jit.trace enforces that only supported features are used.

Thanks for your reply.
How can I export a model to TorchScript, then load it back and export it to ONNX?

To save as TorchScript I am using the following code:

    model.eval()
    dummy_input = torch.randn(1, 3, 224, 224)
    traced = torch.jit.trace(model, dummy_input)
    traced.save("/app_data/test_torch_script/torch_script_test.zip")

Can I change the above code so that its output can be used for exporting to ONNX?

The bug has been fixed at https://github.com/pytorch/pytorch/issues/30512.
