I want a system where model authoring and training are separate steps; the model needs to be serialized to disk between those stages.
Keras has a “model config”, which describes the model architecture in a compact, JSON-friendly way: https://www.tensorflow.org/api_docs/python/tf/keras/models/model_from_config
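To be concrete, this is roughly the workflow I mean, sketched in plain stdlib Python (the `model_to_config` / `model_from_config` names and the layer registry are hypothetical, just to show the authoring/training split):

```python
import json

# Hypothetical layer registry; in a real system these would construct
# nn.Module instances instead of tuples.
LAYERS = {
    "linear": lambda cfg: ("linear", cfg["in"], cfg["out"]),
    "relu": lambda cfg: ("relu",),
}

def model_to_config(arch):
    """Authoring step: describe the architecture as JSON-friendly data."""
    return json.dumps({"layers": arch})

def model_from_config(text):
    """Training step: rebuild the (uninitialized) model from the config."""
    cfg = json.loads(text)
    return [LAYERS[layer["type"]](layer) for layer in cfg["layers"]]

# Authoring produces a small text artifact; training reconstructs from it.
config = model_to_config([
    {"type": "linear", "in": 4, "out": 2},
    {"type": "relu"},
])
model = model_from_config(config)
```

The point is that the config is tiny and human-readable, and no weights exist until the training side decides to create them.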
I want to have something similar for PyTorch. This problem is almost solved by
But I see that the result is fairly large and binary, because the weights are included.
Is there a way to skip saving the parameters? Or to save them in an uninitialized state, so that they are lazily initialized after loading?
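Something along these lines is what I'm imagining, sketched with the meta device (assuming that is even applicable to serialization; `torch.device("meta")` as a context manager and `nn.Module.to_empty` are from recent PyTorch versions):

```python
import torch
import torch.nn as nn

# Build the module on the "meta" device: no real parameter storage is
# allocated, only shapes and dtypes are tracked.
with torch.device("meta"):
    model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

# Parameters exist structurally but carry no data.
assert all(p.is_meta for p in model.parameters())

# Later (e.g. after loading), materialize real storage and initialize.
model = model.to_empty(device="cpu")
for m in model.modules():
    if isinstance(m, nn.Linear):
        nn.init.xavier_uniform_(m.weight)
        nn.init.zeros_(m.bias)
```

If the architecture could be serialized in this meta state, the on-disk artifact would only need to describe structure, not weights.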
TorchScript seems very powerful. Maybe there is a way to save the whole Module, not just the forward method plus the initialized constants?