Calling a Cython Module in TorchScript

Hi, everyone. I'm wondering how to call code written in Cython when creating a JIT script.

Normally, I can call a Cython class directly from a Python file after compiling the Cython modules, and part of my model's inference relies on those Cython modules for acceleration. Is it possible to call the functions and classes written in Cython when creating a jit.script version of my model?

I tried calling them directly (just as I did in the original Python script), but it complains about a missing .py file (I suppose this is expected, because the Cython modules end in .pyx/.pyd and are compiled to cytree.cpython-38-x86_64-linux-gnu.so on my machine):

Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/torch/_utils_internal.py", line 49, in get_source_lines_and_file
    sourcelines, file_lineno = inspect.getsourcelines(obj)
  File "/opt/conda/lib/python3.8/inspect.py", line 967, in getsourcelines
    lines, lnum = findsource(object)
  File "/opt/conda/lib/python3.8/inspect.py", line 790, in findsource
    raise OSError('source code not available')
OSError: source code not available

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "new_convert.py", line 410, in <module>
    tmp=torch.jit.script(LSTMNet(hanabi_config,inverse_transform))
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_script.py", line 897, in script
    return torch.jit._recursive.create_script_module(
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_recursive.py", line 352, in create_script_module
    return create_script_module_impl(nn_module, concrete_type, stubs_fn)
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_recursive.py", line 410, in create_script_module_impl
    create_methods_and_properties_from_stubs(concrete_type, method_stubs, property_stubs)
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/_recursive.py", line 304, in create_methods_and_properties_from_stubs
    concrete_type._create_methods_and_properties(property_defs, property_rcbs, method_defs, method_rcbs, method_defaults)
  File "/opt/conda/lib/python3.8/site-packages/torch/jit/annotations.py", line 76, in get_signature
    source = dedent(''.join(get_source_lines_and_file(fn)[0]))
  File "/opt/conda/lib/python3.8/site-packages/torch/_utils_internal.py", line 56, in get_source_lines_and_file
    raise OSError(msg) from e
OSError: Can't get source for <class 'cytree.Roots'>. TorchScript requires source access in order to carry out compilation, make sure original .py files are available.

How do you integrate Cython modules into a JIT script? Is rewriting the Cython module in plain Python the only solution?

Thanks!

You can add @jit.ignore decorated forwarders.
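For instance, a minimal sketch of such a forwarder (the cytree.Roots name is taken from your traceback; its constructor and the search method used here are hypothetical placeholders):

import torch
import torch.nn as nn
import cytree  # the compiled Cython extension from the traceback


class SearchWrapper(nn.Module):
    @torch.jit.ignore
    def run_search(self, features: torch.Tensor) -> torch.Tensor:
        # Left to the Python interpreter; the TorchScript compiler never sees this body.
        roots = cytree.Roots()  # hypothetical constructor
        return torch.as_tensor(roots.search(features.numpy()))  # hypothetical method

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.run_search(x)


scripted = torch.jit.script(SearchWrapper())  # scripting succeeds; run_search stays Python-only

Note that an ignored method is simply dispatched back to the Python interpreter at run time, so it only works when the scripted module is executed from Python.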

Alternatively, C++ extension ops work with JIT, even supporting gradients, but I don't know exactly how to register them from Python. In C++ it is something like:

static auto registry = torch::RegisterOperators()
    .op("namespace::func", &func);

With that, you can call torch.ops.namespace.func(…).
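On the Python side, loading the compiled extension and calling such an op would look roughly like this (the library path and op name below are placeholders):

import torch

# Loading the shared library runs the static registration shown above.
torch.ops.load_library("build/libmy_ops.so")  # placeholder path

out = torch.ops.namespace.func(torch.randn(4))  # usable from eager mode and from TorchScript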

Hi Alex, thanks for the reply. jit.ignore seems to leave the decorated function out of compilation. Are there any alternatives if I want to keep the decorated function as part of the inference path?

In my case, there are two codebases: (1) Python & Cython and (2) C++ & TorchScript.
On the one hand, the forward pass of the model consists of two steps: first decoding the input into features, and then running a search algorithm (e.g., MCTS) to produce the output. To accelerate the search, that part is written in Cython. The logic here is fairly complex and may not be easy to encapsulate in a single operator.
On the other hand, the second codebase loads the above model and carries out inference as follows.

torch::jit::script::Module model_;
model_ = torch::jit::load(path, torch::Device(device));
std::vector<torch::jit::IValue> jitInput;
// ... push inputs into the vector
auto jitOutput = model_.forward(jitInput);

Check Extending TorchScript with Custom C++ Classes — PyTorch Tutorials 1.10.1+cu102 documentation. If I recall correctly, Cython can integrate C++ modules, so perhaps a hybrid module (an adapter/facade to the Cython parts) would do what you want.
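As a rough sketch of that route, assuming the search logic were wrapped in a custom class registered on the C++ side with torch::class_ (the library path, namespace, class name, and method below are placeholders):

import torch

# Load the shared library whose static initializer registers the custom class.
torch.classes.load_library("build/libmcts.so")  # placeholder path

@torch.jit.script
def run_search(features: torch.Tensor) -> torch.Tensor:
    roots = torch.classes.my_mcts.Roots()  # placeholder namespace::class
    return roots.search(features)          # placeholder method registered via torch::class_

Because the class is registered in C++, it stays available when the scripted model is loaded from the C++ codebase as well.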

Thanks, Alex! I will check that and post my updates here.

@Jacky_Wang Hello Jacky! I've run into the same problem, but I have no idea how to solve it. Can you share any updates?