Export libtorch trained model to load in python

I was wondering if anyone has ideas on how to export a trained model to disk in libtorch and then load it in Python PyTorch? The environment I work in benefits from training in the C++ runtime, but it is often nice to run visualizations on the Python end for insights.

I’ve seen this thread, but there don’t seem to be many options: Exporting libtorch model and loading in Python

Ideally, the loading side in Python can be done with vanilla PyTorch (no C++ extensions). Maybe a custom script that reads the weights from the exported file and sets them accordingly on the Python end, assuming the network structure is the same and the weights are written in a deterministic order? Is there any documentation on how the weights are stored in the exported file? Any other ideas?

Here is what I currently have:

Assuming the model parameters are registered in the same order (i.e. iterating over model->named_parameters() in C++ yields the same order as model.state_dict() in Python), I can save the weights in C++ with torch::save(model->named_parameters().values(), "test.pt");.

On the python side, I can do the following:

# The archive written by torch::save loads as a script module,
# whose parameters are the saved tensors in order.
t = torch.jit.load("test.pt")
tensor_list = list(t.parameters())

model_dict = model.state_dict()
param_names = list(model_dict)

# This relies on the two orderings matching position by position.
for tensor, param_name in zip(tensor_list, param_names):
    model_dict[param_name] = tensor

model.load_state_dict(model_dict)
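One caveat with the zip above: state_dict() also contains buffers (e.g. BatchNorm running statistics) that named_parameters() does not, so for such models the positional mapping will silently misalign. A quick self-contained check, using a hypothetical two-layer model just to illustrate the count mismatch:

```python
import torch
import torch.nn as nn

# Hypothetical model with a BatchNorm layer, purely to show the mismatch.
net = nn.Sequential(nn.Linear(4, 4), nn.BatchNorm1d(4))

n_params = len(list(net.parameters()))  # trainable weights and biases only
n_state = len(net.state_dict())         # also includes BatchNorm buffers

print(n_params, n_state)
```

For a model like this, the positional zip would run out of tensors before the buffer entries are reached, so the approach is only safe for architectures whose state_dict contains parameters alone.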

From my limited testing, this seems to work correctly.
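For completeness, here is a self-contained sketch of the mapping step on the Python side. The two-layer Net is a stand-in for your real architecture, and tensor_list is taken directly from a source model to keep the example runnable; in practice it would come from list(torch.jit.load("test.pt").parameters()) as above. A shape assertion guards against silent misalignment:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    # Stand-in architecture; replace with your real model.
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(4, 8)
        self.fc2 = nn.Linear(8, 2)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

src = Net()  # plays the role of the libtorch-trained model
dst = Net()  # freshly constructed Python model to load into

# Stand-in for list(torch.jit.load("test.pt").parameters()).
tensor_list = [p.detach().clone() for p in src.parameters()]

model_dict = dst.state_dict()
for (param_name, old), tensor in zip(model_dict.items(), tensor_list):
    # Fail loudly if the positional ordering assumption is wrong.
    assert old.shape == tensor.shape, f"shape mismatch at {param_name}"
    model_dict[param_name] = tensor

dst.load_state_dict(model_dict)

# Both models should now produce identical outputs.
x = torch.randn(3, 4)
print(torch.equal(src(x), dst(x)))
```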