Why do we say “pretrained model”?

What I understand is that a model is the result of a training algorithm: the set of parameters learned by the training algorithm is the model.

Since a model is by definition already trained, why do we say “pretrained model”?

If I have any misconceptions here, please correct me.
If my understanding is wrong, please explain the difference between a model and an algorithm.

A model is a group of algorithms stacked together to learn to solve a complex problem.

Yes, a pre-trained model is one that has already been trained on some dataset.

Thanks for the response. I am confused by the term usage. A model is the outcome of a training algorithm, so the term “model” itself already implies “trained”. Then why do we say “pretrained model”?

Yes, we do use the term this way in machine learning.
The term “pre-trained model” originated in transfer learning, where we reuse a model already trained on one dataset to train or evaluate it on some other dataset. In most cases where we say “pre-trained”, we actually use the model to extract features or embeddings rather than reusing the whole model as-is.
I guess this makes some sense.
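To make the feature-extraction idea above concrete, here is a minimal, hypothetical sketch in plain Python. The names (`Extractor`, `Head`) are illustrative only, not any library's API; in practice you would use a real pretrained network such as ResNet from a deep learning framework.

```python
class Extractor:
    """Stands in for a pretrained network: pretend these weights were
    already learned on a large dataset. We keep them frozen and only
    use them to turn raw inputs into feature vectors (embeddings)."""
    def __init__(self, weights):
        self.weights = weights

    def features(self, x):
        # Produce an embedding from a raw input.
        return [w * x for w in self.weights]


class Head:
    """A small task-specific layer trained from scratch on the new task,
    on top of the frozen pretrained features."""
    def __init__(self, n):
        self.w = [0.0] * n  # these are what we would actually train

    def predict(self, feats):
        return sum(w * f for w, f in zip(self.w, feats))


extractor = Extractor(weights=[0.5, -1.0, 2.0])  # "pretrained", frozen
head = Head(n=3)                                 # trainable on the new dataset
emb = extractor.features(4.0)                    # embedding, not a prediction
```

The point of the sketch: the “pretrained” part supplies reusable features, and only the small head is trained on the new dataset.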

Thanks for the explanation. I still have some doubt about the term, but anyway I am going to use “pretrained model”. ResNet and VGG are architectures, not models, right?

See, the literature is full of different terminologies; they are models, which are sometimes also called deep learning architectures.
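One way to see the architecture-vs-model distinction is that an architecture defines the computation, while concrete parameter values make it a model. A minimal sketch (the class and numbers here are made up for illustration):

```python
class TinyNet:
    """The "architecture": defines what computation happens."""
    def __init__(self, w, b):
        # Supplying concrete parameter values turns the
        # architecture into a particular model.
        self.w, self.b = w, b

    def __call__(self, x):
        return self.w * x + self.b


# Same architecture, two different models:
random_model = TinyNet(w=0.1, b=0.0)      # freshly initialized, untrained
pretrained_model = TinyNet(w=2.0, b=1.0)  # parameters learned earlier elsewhere
```

So when people say “ResNet”, they may mean either the architecture or a specific trained instance of it; context usually disambiguates.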

Okay. Thanks for your answers and patience.