How to quantize a model by layer

From reading the PyTorch tutorial, it seems dynamic quantization only lets you quantize a model by op type (correct me if I'm wrong), while static quantization requires modifying the floating-point model by adding `QuantStub` and `DeQuantStub`…
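To illustrate the op-type behavior: a minimal sketch of eager-mode dynamic quantization, where you pass a *set of module types* (not individual layers), so every `nn.Linear` in the model gets swapped. The model here is a toy example, not from any tutorial.

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4)).eval()

# Dynamic quantization targets op *types*: every nn.Linear in the
# model is replaced by a dynamically quantized Linear. There is no
# built-in way here to pick, say, only the first Linear.
qmodel = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

out = qmodel(torch.randn(2, 16))  # weights int8, activations float
```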

Is there a way to quantize a model one layer at a time? The approach I can think of is to use static quantization: modify the model first, and instead of setting a global `model.qconfig`, attach a `model.layerx.qconfig` only to the layers I want to quantize…

Is there any other way to do this without modifying the model itself? There are many models out there with different implementations…
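One option that might fit here is FX graph mode quantization, which traces the model and inserts the quant/dequant ops into the graph for you, so no `QuantStub`/`DeQuantStub` edits are needed. A sketch assuming a recent PyTorch (the `QConfigMapping` API, roughly 1.13+); `"0"` is just the name of the first submodule in this toy `Sequential`:

```python
import torch
import torch.nn as nn
from torch.ao.quantization import QConfigMapping, get_default_qconfig
from torch.ao.quantization.quantize_fx import prepare_fx, convert_fx

model = nn.Sequential(nn.Linear(16, 8), nn.ReLU(), nn.Linear(8, 4)).eval()
example_inputs = (torch.randn(2, 16),)

# Quantize only the submodule named "0" (the first Linear); all other
# modules stay float because they get no qconfig in the mapping.
qconfig_mapping = QConfigMapping().set_module_name(
    "0", get_default_qconfig("fbgemm")
)

prepared = prepare_fx(model, qconfig_mapping, example_inputs)
prepared(*example_inputs)            # calibration pass
quantized = convert_fx(prepared)
out = quantized(*example_inputs)
```

Since FX operates on the traced graph rather than the source code, the same per-module-name mapping works across different model implementations, as long as the model is symbolically traceable.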