Training a torch::jit::script::Module

There is an example of training a torch::nn::Module at https://pytorch.org/cppdocs/frontend.html#end-to-end-example, but when I try to use a torch::jit::script::Module, I cannot call .parameters() as in that example.

Is there a way to train a torch::jit::script::Module in C++?


Oddly, unlike in Python, jit::script::Module doesn't share the same API as nn::Module in C++.

You should be able to access a script::Module's parameters via something like:

for (const auto& param_slot : my_module.get_parameters()) {
    // Each slot holds an IValue; unwrap it to get the underlying tensor.
    auto my_tensor = param_slot.value().toTensor();
}

script::Module::get_parameters() returns a type called slot_list_impl<>.

Do you have advice on how I can construct one of these from the tensors returned by script::Module::get_parameters()?

slot_list_impl<NameValue> is an iterator over one of a script::Module's internal lists of Parameters/Attributes. Can you elaborate on your use case for constructing one yourself? You should be able to modify a script::Module by using script::Module::register_parameter to change any parameter value.

I was trying to adapt the training loop to work with torch::jit::script::Module, so I needed the same interface the optimizer expects. However, I realized that the optimizer accepts std::vector<at::Tensor>, so I don't actually need to create a slot_list_impl after all.

Maybe you can take a look at what I was trying to do in this thread: "jit::script::Module parameters are not updating when training".

I think for the code to actually work, I need to use register_parameter or set_parameter to push the updated tensor back into the script::Module after the optimizer has stepped. Does that sound right? Or is there a way for the optimizer to update the tensor values in place?