No pretrained weights exist for this model, using random initialization

Hello,

I have a question regarding transfer learning. Every time I try to load the weights of a model that I have trained, I get: “No pretrained weights exist for this model, using random initialization”. I am wondering whether I am doing the transfer learning properly.

Here is my class definition:

import pytorch_lightning as pl
import timm
import torch

class TIMMModels(pl.LightningModule):

    def __init__(self):
        super().__init__()
        self.model = timm.create_model('resnet34', pretrained=True, in_chans=1, drop_rate=0.2, num_classes=2)
        for param in self.model.parameters():
            param.requires_grad = False
        self.model.eval()  # keep normalization layers (e.g. BatchNorm) in evaluation-mode behavior

        num_in_features = self.model.get_classifier().in_features
        self.model.fc = torch.nn.Sequential(  # new head; requires_grad is True by default
            torch.nn.BatchNorm1d(num_in_features),
            torch.nn.Linear(in_features=num_in_features, out_features=256, bias=False),
            torch.nn.ReLU(),
            torch.nn.Linear(in_features=256, out_features=2, bias=False),
        )

    def forward(self, x):
        x = self.model(x)
        return x

After the training, I try to load the weights:

model_trained = TIMMModels.load_from_checkpoint(checkpoint_path).to(device)

and then I get: “No pretrained weights exist for this model, using random initialization”. I double-checked the path, and the file was there. Do I need to save the checkpoint in a different way, or am I doing something wrong in the transfer learning procedure?
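For what it's worth, my understanding (an assumption about Lightning's behavior, not something stated in this thread) is that `load_from_checkpoint` re-instantiates the class, so `__init__` and therefore `timm.create_model(..., pretrained=True)` run again before the saved weights are loaded on top; a message at load time would come from that re-run, not from the checkpoint file itself. A plain-PyTorch sketch of the same two-step pattern, using a toy `Linear` layer as a stand-in:

```python
import os
import tempfile
import torch

# Train side: save the module's state_dict (Lightning stores this under
# checkpoint['state_dict']; here it is a bare torch example).
net = torch.nn.Linear(4, 2)
path = os.path.join(tempfile.mkdtemp(), 'ckpt.pt')
torch.save(net.state_dict(), path)

# Load side: a fresh instance starts from a new random init
# (analogous to __init__ running again inside load_from_checkpoint)...
restored = torch.nn.Linear(4, 2)
# ...and load_state_dict then overwrites that init with the saved weights.
restored.load_state_dict(torch.load(path))
print(torch.equal(net.weight, restored.weight))  # True
```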

Thank you.

I found the problem: the name of the model was wrong.