I would like to ask a basic question.
I save my training models by saving their `state_dict`, and I load them back with `load_state_dict()`.
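Roughly like this (a minimal sketch; `Net` is a hypothetical stand-in for my actual architecture, with the final layer named `fc` to match the error below):

```python
import io
import torch
import torch.nn as nn

# "Net" is a placeholder for the real architecture (assumption);
# the final fully connected layer is named "fc".
class Net(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.hidden = nn.Linear(8, 16)
        self.fc = nn.Linear(16, num_classes)

    def forward(self, x):
        return self.fc(torch.relu(self.hidden(x)))

model = Net(num_classes=5)

# Save: serialize only the parameters, not the whole module.
buffer = io.BytesIO()  # stands in for a file on disk
torch.save(model.state_dict(), buffer)

# Load: restore the parameters into a freshly constructed model.
buffer.seek(0)
restored = Net(num_classes=5)
restored.load_state_dict(torch.load(buffer))
```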
My training data can get updated over time, so when I train a new model on the updated data, I use a previous model as pretrained weights. The problem is that if the new data has more (or fewer) classes, I get an error:
```
size mismatch for fc.bias: copying a param of torch.Size() from checkpoint, where the shape is torch.Size() in current model.
```
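This is easy to reproduce with a toy model (a hypothetical sketch; `Net` and the class counts 5 and 7 are illustrative, not my real setup):

```python
import io
import torch
import torch.nn as nn

# Hypothetical model whose final layer shape depends on the class count.
class Net(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        self.fc = nn.Linear(16, num_classes)

# Train and save a checkpoint with the old number of classes.
old_model = Net(num_classes=5)
buffer = io.BytesIO()  # stands in for the checkpoint file
torch.save(old_model.state_dict(), buffer)
buffer.seek(0)

# The updated data has a different number of classes, so fc no longer fits.
new_model = Net(num_classes=7)
error_message = ""
try:
    new_model.load_state_dict(torch.load(buffer))
except RuntimeError as err:
    error_message = str(err)
print(error_message)  # reports size mismatches for fc.weight and fc.bias
```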
My question is: how can I load the pretrained model without its final fully connected layer, so that this error does not occur?
Thank you very much for your help!