Quantization-aware training for a parametrized model

My model contains a parametrized linear layer (via `torch.nn.utils.parametrize`), and I would like to run quantization-aware training (QAT) on it. However, when I prepare the model for QAT, I get the following error: "RuntimeError: Serialization of parametrized models is only supported through state_dict().". The error also links to a guide about saving and loading models, which does not help me here.
How can I do quantization-aware training with a parametrized layer?