Hi,
I’ve written a model in Python, translated it to TorchScript with torch.jit.script(model), and serialized it with torch.jit.save(). In a C++ program, I can load the model with torch::jit::load() and run inference with model.forward().
Is there a way to access methods other than forward on the model from C++?
A direct model.my_method() call would require the C++ compiler to know about that method at compile time. If the model is loaded at runtime, it may have methods whose names are not known in advance, so the compiler (i.e., I) doesn’t know how to invoke them. Is there a way to accomplish this, e.g., with something like model.call('my_method', method_args)?
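To make the intent concrete, here is a sketch of the kind of dynamic, name-based dispatch I have in mind. The call() method is invented for illustration; as far as I can tell it does not exist in the libtorch API:

```cpp
// Hypothetical API -- 'call' is not a real torch::jit method; this only
// illustrates resolving the method name at runtime instead of compile time.
torch::jit::script::Module model = torch::jit::load("model.pt");
std::vector<torch::jit::IValue> method_args{torch::ones({1, 4})};
torch::jit::IValue result = model.call("my_method", method_args);
```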