Hi everyone, first I just wanted to say that I really enjoy working with libtorch.
Is there a way to compile my C++ code with the model in such a way that when the executable is run, it doesn’t need to load the model from the file, but rather has the model “hard coded” into the program itself? That way I could run the executable with no external file dependencies.
Is this something that could be done with torchscript?
I ask because that would be really awesome for deployment purposes, where I try to make my executable as light and as fast as possible. If that’s not possible, I might have to set it up as a daemon communicating over local sockets.
Edit: to be really clear, my end goal is to incorporate this into C code where it doesn’t need to load the file each time the function is called since the file loading is quite the bottleneck.
This is very likely a bad idea but can’t you bundle the file content into a static string when you compile and load from that string?
@albanD Haha! That was my first idea as well and I was hoping there was something a little less hacky. I may try it out and report back.
It would be really cool if I could compile the model itself down into a binary without such shenanigans though. I’m just not sure how it would be done.
@albanD it is indeed a bad idea! The reasoning is actually interesting though. I thought (for some reason) that loading the model from the file was the bottleneck, but the bottleneck is actually libtorch parsing the model settings and constructing the algorithm itself. Because of that, there is no speed boost from just hard coding the string into the executable.
That being said, since the libtorch library essentially “compiles” the model, it would be cool if there were a functionality to save the model as an executable. Unfortunately I don’t have the time to hack on something like that currently. But maybe in the future if I’m looking for a project I’ll work on that.
As for right now, I’m just setting my model up as a daemon communicating with other processes through local sockets, and that seems adequate.