Hi,
Assume I have defined a C++ LibTorch module whose forward() takes a custom struct as input. For example:
torch::Tensor forward(custom_d input) {
    return input.a + input.b;
}
I want to save the model and load it in Python via torch.jit.load. Is it possible to run inference with the pre-trained model, given that I don't have this data structure (custom_d) in my Python program?
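For context, one commonly used workaround (not necessarily the only answer) is to avoid the arbitrary C++ struct at the interface boundary and instead use a type TorchScript already understands, such as a NamedTuple, so the Python side needs no extra definitions. The sketch below illustrates this from the Python side; `CustomD` and the field names `a`/`b` are hypothetical stand-ins for the struct in the question, and the module is scripted in Python rather than exported from C++:

```python
import io
from typing import NamedTuple

import torch

# A NamedTuple standing in for the C++ struct; TorchScript supports this type.
class CustomD(NamedTuple):
    a: torch.Tensor
    b: torch.Tensor

class Adder(torch.nn.Module):
    def forward(self, input: CustomD) -> torch.Tensor:
        # Same computation as the forward() in the question.
        return input.a + input.b

# Script, serialize to an in-memory buffer, and reload with torch.jit.load.
scripted = torch.jit.script(Adder())
buf = io.BytesIO()
torch.jit.save(scripted, buf)
buf.seek(0)
loaded = torch.jit.load(buf)

# Because a NamedTuple is structurally a tuple, a plain tuple also works here.
out = loaded(CustomD(torch.ones(2), torch.ones(2)))
print(out)  # tensor([2., 2.])
```

If the struct must stay on the C++ side, the alternative is registering it as a TorchScript custom class (torch::class_ in C++), which makes it visible to loaded models in Python as well.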