Hello,

I would like to ask a basic question.

I save my training models using

`torch.save(model.state_dict(), save_path)`

and load them using

`model.load_state_dict(torch.load(save_path))`

My training data can get updated over time, so when I train a new model on the updated data, I use a previous model as pretrained weights. The problem is that if the new data has more (or fewer) classes, I get an error:

`size mismatch for fc.bias: copying a param of torch.Size([45]) from checkpoint, where the shape is torch.Size([44]) in current model.`

My question is: how can I load the pretrained model without the fully connected layer, so that this error does not occur?
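For reference, here is a minimal sketch of the kind of partial loading I have in mind: drop the classifier's entries from the checkpoint and load the rest with `strict=False`. The model below is a toy stand-in; I only assume the classifier layer is named `fc`, as in the error message above.

```python
import torch
import torch.nn as nn

# Toy model whose classifier layer is named `fc`, matching the error message.
class Net(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.backbone = nn.Linear(10, 8)
        self.fc = nn.Linear(8, num_classes)

    def forward(self, x):
        return self.fc(torch.relu(self.backbone(x)))

# "Old" model trained with 45 classes; "new" model needs 44.
old_model = Net(num_classes=45)
torch.save(old_model.state_dict(), "pretrained.pt")

new_model = Net(num_classes=44)
state = torch.load("pretrained.pt")

# Drop the classifier's parameters, then load the rest non-strictly.
state = {k: v for k, v in state.items() if not k.startswith("fc.")}
missing, unexpected = new_model.load_state_dict(state, strict=False)
print(missing)      # the fc.* keys are left at their fresh initialization
print(unexpected)
```

With `strict=False`, `load_state_dict` returns the missing and unexpected keys instead of raising, so the backbone weights are copied and only the new classifier starts from scratch.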

Thank you very much for your help!