Load multiple models at the same time

I’d like to load two models in one inference script to test two datasets. Both models are instantiated from the class Darknet (which inherits from nn.Module).
When I debug the script in PyCharm, the two models seem to share the same memory space. That looks a little strange, so I’m not sure whether I’m doing this the right way.
Could somebody help me please?

Here is the code:

    # Select the device once so it is defined for both branches
    # (in the original, `device` was only set inside the first branch,
    # so it would be undefined if only model2 needed loading)
    device = torch_utils.select_device()

    if model1 is None:
        # Initialize the first model and load its weights
        model1 = Darknet(cfg, img_size).to(device)
        _ = load_darknet_weights(model1, weights1)

    if model2 is None:
        # Initialize the second model and load its weights
        model2 = Darknet(cfg, img_size).to(device)
        _ = load_darknet_weights(model2, weights2)

You can control where your models are loaded with .to(device), as you’re already doing. If you only have one GPU, or if you explicitly select the same device for both, both models will be placed in that device’s memory (as long as they both fit). You can load as many models onto a single device as you like, and they will happily coexist until you’ve filled up the device’s memory.
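
If you want to convince yourself that the two instances don’t actually share parameter storage, you can compare the underlying data pointers. Here’s a minimal sketch reusing the Darknet, cfg, img_size, and weight-loading names from your script:

    import torch

    device = torch_utils.select_device()

    # Two independent instances of the same architecture on one device
    model1 = Darknet(cfg, img_size).to(device)
    model2 = Darknet(cfg, img_size).to(device)
    _ = load_darknet_weights(model1, weights1)
    _ = load_darknet_weights(model2, weights2)

    # Each instance owns its own parameter tensors, so the storage
    # pointers differ even though both live on the same device
    p1 = next(model1.parameters())
    p2 = next(model2.parameters())
    print(p1.data_ptr() == p2.data_ptr())  # False: no shared memory

    # Both models can run inference back to back on the same GPU
    model1.eval()
    model2.eval()
    with torch.no_grad():
        x = torch.randn(1, 3, img_size, img_size, device=device)
        out1 = model1(x)
        out2 = model2(x)

What the debugger shows you in PyCharm is likely just two references into the same CUDA memory pool, not the two models aliasing each other.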

Thanks for your reply!
So the models won’t conflict with each other even if both run on the same GPU, right? If so, there’s no need to reply; I’ll just accept your answer.

They won’t conflict, but since they run on the same GPU they can’t execute truly in parallel, so you may see lower performance than if you ran them on separate GPUs.
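
For reference, if you do pick up a second GPU later, the only change is the device each model is sent to. A minimal sketch, assuming two CUDA devices are visible:

    import torch

    # Pin each model to its own GPU (assumes cuda:0 and cuda:1 exist)
    device1 = torch.device('cuda:0')
    device2 = torch.device('cuda:1')

    model1 = Darknet(cfg, img_size).to(device1)
    model2 = Darknet(cfg, img_size).to(device2)

    # Inputs must live on the same device as the model consuming them.
    # CUDA kernels launch asynchronously, so the two forward passes can
    # overlap instead of queuing up on a single device.
    x = torch.randn(1, 3, img_size, img_size)
    with torch.no_grad():
        out1 = model1(x.to(device1))
        out2 = model2(x.to(device2))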

Thanks for your advice. Yeah, it’s kind of a problem though, since I only have one GPU :sweat_smile: If I get a second one later, I’ll consider running them on different GPUs.