Libtorch with cloud solutions?

Is it possible to use libtorch with cloud solutions such as MS Azure, Google Cloud, or others?
I would hate having to use Python for that :grin:

…or maybe I can just upload a set of data and my saved model and ask the remote servers to do the training?

If you would like to train a model, the cloud providers should already offer containers with a reasonable setup to start your training from.
In that case I would recommend the Python frontend; is there a particular reason you would like to use libtorch for training?

For inference I'll have to pass, as I don't know what the best approach currently is.

Because I have long experience with C++ but none with Python, and my brief exposure to Python and its packages everywhere makes me want to stay away from it as much as possible…

I work on an RL trainer (virtual robot locomotion) written on top of libtorch/C++.
My local machine is not powerful enough, so I'm considering moving the training to the cloud, but I have no clue even how to approach it.
Has anyone had experience launching a console executable with extensive CUDA usage on a cloud server?
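One common approach is to package the executable in a container and run it on a GPU instance from any provider. Below is a minimal sketch, assuming you have already built a binary (here hypothetically named `trainer`) against a CUDA-enabled libtorch; the CUDA base image tag and the paths are illustrative assumptions, not prescriptions, so match them to your own CUDA version and build layout.

```dockerfile
# Sketch: run a libtorch/CUDA console executable on a cloud GPU VM.
# Assumptions: `build/trainer` is your prebuilt binary, `libtorch/lib`
# holds the shared libraries it links against, and the base image tag
# matches the CUDA version you compiled with.
FROM nvidia/cuda:12.1.1-cudnn8-runtime-ubuntu22.04

# Copy the binary and the libtorch shared libraries it needs at runtime.
COPY build/trainer /opt/app/trainer
COPY libtorch/lib /opt/app/lib
ENV LD_LIBRARY_PATH=/opt/app/lib

WORKDIR /opt/app
ENTRYPOINT ["./trainer"]
```

On a GPU instance with the NVIDIA Container Toolkit installed (the GPU images most providers offer include it), you would then start training with something like `docker run --gpus all <image>`, so the container sees the host GPUs.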