Merging two or more models together

Is it possible to merge many models together, whilst maintaining the same overall model structure?

For example, I have three models trained on separate parts of the same dataset. I would then like to merge them into a single model that keeps the same model structure (so I can use the same scripts etc.).

Is it possible to do so?
Would the model be worse than if I trained one single model?
Would this scale?

Many thanks in advance

I’m not sure what “merge” means in this case, i.e. whether you want to reduce the different parameter sets to a single one or create a model ensemble.
In the former case, you could use this approach, but note this concern. In the latter case, you could use this approach.
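
To make the two options more concrete, here is a minimal sketch (not the code from the linked posts); `Net`, the averaging loop, and the `Ensemble` wrapper are just illustrative placeholders and assume all models share the same architecture:

```python
import copy
import torch
import torch.nn as nn

# Placeholder architecture - assume all models are instances of the same class
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(10, 2)

    def forward(self, x):
        return self.fc(x)

models = [Net(), Net(), Net()]  # pretend these were trained on different parts of the dataset

# Former case: reduce the parameter sets to a single one by averaging the state_dicts.
# Note: plain averaging assumes floating-point tensors; integer buffers
# (e.g. BatchNorm's num_batches_tracked) would need special handling.
state_dicts = [m.state_dict() for m in models]
avg_state_dict = copy.deepcopy(state_dicts[0])
for key in avg_state_dict:
    avg_state_dict[key] = torch.stack([sd[key] for sd in state_dicts], dim=0).mean(dim=0)

merged = Net()
merged.load_state_dict(avg_state_dict)  # same structure, so the existing scripts keep working

# Latter case: keep all models and average their outputs in an ensemble wrapper
class Ensemble(nn.Module):
    def __init__(self, models):
        super().__init__()
        self.models = nn.ModuleList(models)

    def forward(self, x):
        return torch.stack([m(x) for m in self.models], dim=0).mean(dim=0)

ensemble = Ensemble(models)
out = ensemble(torch.randn(4, 10))  # [4, 2], averaged logits
```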

This would be my concern, but you should run your experiments and see if it works for your use case. E.g. Stochastic Weight Averaging also works, but it averages “similar” checkpoints, not completely different models, which might have converged to different minima.
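
For reference, a rough sketch of how the built-in SWA utilities (`torch.optim.swa_utils`) average checkpoints of a single training run; the dummy data, model, learning rates, and epoch counts are placeholders, not a recipe:

```python
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
from torch.optim.swa_utils import AveragedModel, SWALR, update_bn

# Dummy data and model just to make the sketch runnable
train_loader = DataLoader(
    TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))), batch_size=16
)
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

swa_model = AveragedModel(model)        # keeps a running average of the weights
swa_scheduler = SWALR(optimizer, swa_lr=0.05)

swa_start = 5
for epoch in range(10):
    for data, target in train_loader:
        optimizer.zero_grad()
        loss = F.cross_entropy(model(data), target)
        loss.backward()
        optimizer.step()
    if epoch >= swa_start:
        swa_model.update_parameters(model)  # average the current checkpoint into swa_model
        swa_scheduler.step()

# Recompute BatchNorm statistics for the averaged weights
# (a no-op here, since the dummy model has no BatchNorm layers)
update_bn(train_loader, swa_model)
```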

@ptrblck thanks a lot for the fast response

I’m not sure what “merge” means in this case

I simply want to merge the learned weights from the 2nd model into the first model.

Taking a look, it seems the former link could be a solution to my question.