Hello,
I'm having trouble creating the appropriate types for my inference in C++. I have a custom class like:
class CustomClass:
    def __init__(self, tensor1: torch.Tensor, tensor2: torch.Tensor):
        self.tensor1 = tensor1
        self.tensor2 = tensor2

    def print(self):
        print(self.tensor1)
        print(self.tensor2)
        tensor3 = self.tensor1 + self.tensor2
        print(tensor3)
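For context, a class like this does compile as a TorchScript class: when a scripted function references it, `torch.jit.script` compiles it recursively. A minimal self-contained sketch (the `use_custom` and `sum` names are mine, and I use a returning method instead of `print` so the result is checkable):

```python
import torch

class CustomClass:
    def __init__(self, tensor1: torch.Tensor, tensor2: torch.Tensor):
        self.tensor1 = tensor1
        self.tensor2 = tensor2

    def sum(self) -> torch.Tensor:
        return self.tensor1 + self.tensor2

@torch.jit.script
def use_custom(t1: torch.Tensor, t2: torch.Tensor) -> torch.Tensor:
    # TorchScript compiles CustomClass recursively when it is used here
    return CustomClass(t1, t2).sum()

out = use_custom(torch.ones(2), torch.ones(2))
```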
and I want to take an object of this class as a parameter in the forward function of my Model:
class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()

    def forward(self, custom_class: CustomClass):
        custom_class.print()
but after I run
jitted = torch.jit.script(model)
jitted.save("model.pt")
and try to load and run it in C++:
torch::jit::script::Module model;
try {
    model = torch::jit::load(MODEL, torch::kCUDA);
    model.eval();
} catch (const c10::Error& e) {
    fmt::print("{}\n", e.what());
}

std::vector<torch::jit::IValue> inputs_model;
std::vector<torch::Tensor> tensors;
inputs_model.push_back(tensors);
auto warmup_output = model.forward(inputs_model);
where MODEL is defined in CMake via target_compile_definitions(${TARGET} PUBLIC MODEL="/model.pt"), I of course get a type exception like:
C++ exception with description "forward() Expected a value of type '__torch__.CustomClass (of Python compilation unit at: 0xb80611a0)' for argument 'custom_class' but instead found type 'List[Tensor]'.
I have already tried exporting to C++ a function that creates this CustomClass object:
def create_custom_model(tensor1: torch.Tensor, tensor2: torch.Tensor):
    return CustomClass(tensor1, tensor2)

jitted = torch.jit.script(create_custom_model)
jitted.save("custom_class.pt")
But then the two Python compilation units differ:
C++ exception with description "forward() Expected a value of type '__torch__.CustomClass (of Python compilation unit at: 0xb9f10640)' for argument 'custom_class' but instead found type '__torch__.CustomClass (of Python compilation unit at: 0xb8345e80)'.
My best workaround so far has been to create a wrapper model that takes plain tensors in forward:
class TensorModel(nn.Module):
    def __init__(self):
        super(TensorModel, self).__init__()
        self.model = Model()

    def forward(self, tensor1: torch.Tensor, tensor2: torch.Tensor):
        custom_class = CustomClass(tensor1, tensor2)
        self.model(custom_class)
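For what it's worth, this workaround does script and run end to end on the Python side. A minimal self-contained version (I give the forward methods return values so the result is checkable):

```python
import torch
import torch.nn as nn

class CustomClass:
    def __init__(self, tensor1: torch.Tensor, tensor2: torch.Tensor):
        self.tensor1 = tensor1
        self.tensor2 = tensor2

class Model(nn.Module):
    def forward(self, custom_class: CustomClass) -> torch.Tensor:
        return custom_class.tensor1 + custom_class.tensor2

class TensorModel(nn.Module):
    def __init__(self):
        super(TensorModel, self).__init__()
        self.model = Model()

    def forward(self, tensor1: torch.Tensor, tensor2: torch.Tensor) -> torch.Tensor:
        # construct the TorchScript class inside the scripted forward
        custom_class = CustomClass(tensor1, tensor2)
        return self.model(custom_class)

jitted = torch.jit.script(TensorModel())
out = jitted(torch.ones(3, 3), torch.ones(3, 3))
```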
and then passing the tensors like:
torch::jit::script::Module model;
try {
    model = torch::jit::load(MODEL, torch::kCUDA);
    model.eval();
} catch (const c10::Error& e) {
    fmt::print("{}\n", e.what());
}

std::vector<torch::jit::IValue> inputs_model;
auto tensor1 = torch::ones({3, 3}, options);
inputs_model.emplace_back(tensor1);
auto tensor2 = torch::ones({3, 3}, options);
inputs_model.emplace_back(tensor2);
auto warmup_output = model.forward(inputs_model);
where options is auto options = torch::TensorOptions().device(torch::kCUDA);.
But is there a way to create this __torch__.CustomClass object in C++ and pass it to the forward method in a more C++-like way?
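One direction I suspect might work is to export a factory method on the same module, so the CustomClass it returns belongs to the same compilation unit as forward (the make_input name is made up; methods other than forward need @torch.jit.export to be compiled):

```python
import torch
import torch.nn as nn

class CustomClass:
    def __init__(self, tensor1: torch.Tensor, tensor2: torch.Tensor):
        self.tensor1 = tensor1
        self.tensor2 = tensor2

class Model(nn.Module):
    def forward(self, custom_class: CustomClass) -> torch.Tensor:
        return custom_class.tensor1 + custom_class.tensor2

    # hypothetical exported factory: lives in the same scripted module,
    # so its return type is the same __torch__.CustomClass forward expects
    @torch.jit.export
    def make_input(self, tensor1: torch.Tensor, tensor2: torch.Tensor) -> CustomClass:
        return CustomClass(tensor1, tensor2)

jitted = torch.jit.script(Model())
obj = jitted.make_input(torch.ones(3), torch.ones(3))
out = jitted(obj)
```

In C++ I would then hope to call the exported method on the loaded module and feed its result back into forward, something like auto obj = model.get_method("make_input")({tensor1, tensor2}); model.forward({obj}); — but I haven't verified that part.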
I'm trying this on Python 3.8, torch 1.10.1, and libtorch-cxx11-abi-shared-with-deps-1.10.1+cu113, with G++ 9.3 on Linux, working in CLion/PyCharm.