Model inference on CPU + multithreading

Good afternoon!
I would like to understand the details of model inference on the CPU. On a multi-core processor, will a pre-trained model use only one core, or will it automatically use all of them? And if the CPU is not fully utilized automatically, is there a way to combine multithreading with model inference? Thanks for your advice!
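For context, here is what I have tried so far: a minimal sketch that inspects the available cores and caps the thread count through environment variables. I am assuming the inference framework is backed by OpenMP/MKL (as PyTorch, TensorFlow, and ONNX Runtime typically are), in which case these variables must be set before the framework is imported; the specific values here are just placeholders.

```python
import os

# Assumption: the inference library reads these variables at import
# time to size its intra-op thread pool, so they must be set first.
os.environ["OMP_NUM_THREADS"] = "4"
os.environ["MKL_NUM_THREADS"] = "4"

# Total logical cores visible to this process.
print("logical cores:", os.cpu_count())
print("OMP_NUM_THREADS:", os.environ["OMP_NUM_THREADS"])

# Only after this point would one `import torch` (or similar) and run
# inference; e.g. PyTorch also exposes torch.set_num_threads(n).
```

Is this the right general approach, or do the frameworks manage core usage entirely on their own?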