Unexpected (for some) keys in load_state_dict

I'm trying to load a saved model like this:

import torch
from collections import OrderedDict

checkpoint = torch.load(args.ckpt)
model = GeneratorSeg(args.size, args.latent, args.n_mlp, image_mode=args.image_mode,
                     channel_multiplier=args.channel_multiplier, seg_dim=args.seg_dim
                     ).to(device)

new_state_dict = OrderedDict()
for k, v in checkpoint['g_ema'].items():
    print(k)  # I can see there are "convs.12.conv.weight", "convs.12.conv.blur.kernel" keys
    new_state_dict[k] = v

model.load_state_dict(new_state_dict)
model.eval()

But I get this error:
Unexpected key(s) in state_dict: "convs.12.conv.weight", "convs.12.conv.blur.kernel", ...
It's a bit confusing: when I print the keys, I can see that exactly these keys exist in the checkpoint.
Does anybody have any idea?

The key mismatch errors are raised e.g. if you've changed the model architecture or, more generally, the attribute names of the parameters or buffers after saving the state_dict.
In your case "convs.12.conv.weight" and "convs.12.conv.blur.kernel" are unexpected in the state_dict and cannot be mapped to the current model, i.e. if you check the parameter names of the model, you should not be able to find these keys:

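# note: blur.kernel is most likely a registered buffer, which named_parameters()
# won't list; model.state_dict().keys() shows both parameters and buffers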
for name, param in model.named_parameters():
    print(name)

They might e.g. have been moved to another attribute name, or the current model might simply contain fewer layers than the saved one.
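
If it helps, here is a minimal, self-contained sketch of the failure mode (SavedModel and CurrentModel are hypothetical stand-ins, not the actual GeneratorSeg): a checkpoint saved from a model with more conv layers than the freshly instantiated one raises exactly this kind of error:

import torch.nn as nn

class SavedModel(nn.Module):      # architecture at save time: 13 conv layers
    def __init__(self):
        super().__init__()
        self.convs = nn.ModuleList([nn.Conv2d(3, 3, 3) for _ in range(13)])

class CurrentModel(nn.Module):    # architecture at load time: only 12 conv layers
    def __init__(self):
        super().__init__()
        self.convs = nn.ModuleList([nn.Conv2d(3, 3, 3) for _ in range(12)])

state_dict = SavedModel().state_dict()   # contains "convs.12.weight" and "convs.12.bias"
model = CurrentModel()

model.load_state_dict(state_dict)
# RuntimeError: Error(s) in loading state_dict for CurrentModel:
#     Unexpected key(s) in state_dict: "convs.12.weight", "convs.12.bias".

If the extra keys really correspond to layers the current model should not have, model.load_state_dict(state_dict, strict=False) will skip them (and return the lists of missing and unexpected keys), but that silently drops weights. Usually the better fix is to instantiate the model with the same constructor arguments (e.g. args.size, args.channel_multiplier) that were used when the checkpoint was trained, since these typically determine how many convs entries are created.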