Loading model and weight each time in the for loop?

all_prediction = np.zeros((len(test_dataset), num_classes))

for i, weight in enumerate(weight_list):
    for _ in range(tta):
        model = ...

        model.load_state_dict(torch.load(weights_path / weight))
        model.eval()  # switch off dropout / batchnorm updates for inference

        prediction = np.zeros((len(test_dataset), num_classes))  # num_classes=196
        with torch.no_grad():
            for j, images in enumerate(test_loader):  # j, so the fold index i is not shadowed
                images = images.cuda()

                preds = model(images).detach()
                prediction[j * batch_size: (j + 1) * batch_size] = preds.cpu().numpy()
        all_prediction += prediction  # accumulate once per TTA pass, not once per batch

all_prediction /= total_num_models

As you can see in my code, I'm constructing the model and loading the weights inside the for loop.
Do I have to construct the model every time I load a weight file, or could I create the model once and just overwrite its weights?

No, you don't have to load the weights each iteration. You normally load them once at the start of your program.

But as you can see in my code, I have different weight files trained on different folds, meaning that I will get different predictions for each set of weights.


If your question is whether you should call the model constructor every time to create a new object to load the weights into, then no, you don't have to. You can create the model object once at the start and just load each set of weights into it, IMO.
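A minimal sketch of that "construct once, load weights per fold" pattern. The tiny `nn.Linear` model, the fold count, and the data shapes below are made up for illustration; in your code you would substitute your own model class and call `torch.load()` on your real checkpoint paths:

```python
import numpy as np
import torch
import torch.nn as nn

num_classes = 4
num_samples = 8

model = nn.Linear(16, num_classes)  # construct the model object once

# Stand-ins for your per-fold checkpoint files: we keep the state_dicts
# in memory here instead of calling torch.load() on real paths.
fold_state_dicts = [nn.Linear(16, num_classes).state_dict() for _ in range(3)]

inputs = torch.randn(num_samples, 16)
all_prediction = np.zeros((num_samples, num_classes))

for state_dict in fold_state_dicts:
    model.load_state_dict(state_dict)  # overwrite weights in the same object
    model.eval()
    with torch.no_grad():
        all_prediction += model(inputs).numpy()

all_prediction /= len(fold_state_dicts)  # average the per-fold predictions
print(all_prediction.shape)  # (8, 4)
```

`load_state_dict` copies the tensors into the existing module's parameters, so no new model object is allocated per fold; only the construction cost is saved, the loading itself still happens once per weight file.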