Saving a PyTorch model

I want to save my model in python and load it in C++.

According to LOADING A PYTORCH MODEL IN C++, I’m supposed to trace my model using torch.jit.trace() and then save that model.

However, I’d like to save my model as it is (a subclass of nn.Module), so that if I load it in Python, it still has the same attributes (which contain meta information about the model, such as how many resblocks are used, etc.).

Is this possible? I’m asking because currently, if I jit.trace() the model, all this information is lost.
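To make the problem concrete, here is a minimal sketch of what I mean. The model class `Net` and the attribute name `n_resblocks` are made up for illustration; the point is that a plain Python attribute holding meta information does not appear to survive the trace/save/load round trip:

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self, n_resblocks=4):
        super().__init__()
        # Meta information about the architecture; not a parameter or buffer,
        # and not used inside forward(), so tracing has no reason to record it.
        self.n_resblocks = n_resblocks
        self.fc = nn.Linear(8, 8)

    def forward(self, x):
        return self.fc(x)

model = Net(n_resblocks=16)
traced = torch.jit.trace(model, torch.randn(1, 8))
torch.jit.save(traced, "model.pt")

loaded = torch.jit.load("model.pt")
# The weights are there, but the custom attribute is gone on the loaded module.
print(hasattr(loaded, "n_resblocks"))
```

In my understanding, tracing only records the tensor operations executed by `forward()`, so attributes like this would have to be stored some other way (or the module scripted with explicit attribute annotations), but I may be missing something.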