PyTorch: do not load weights if there is a size mismatch

I can ignore missing keys when loading weights by setting strict=False. What about weights where there is a size mismatch? How can I ignore them, i.e. not load them at all?

You could filter out the wrongly shaped parameters from the state_dict and then load the filtered dict with strict=False.
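A minimal sketch of that approach: compare each checkpoint tensor's shape against the current model's state_dict and keep only matching entries (the two Sequential models here are made-up examples standing in for your old and new architectures).

```python
import torch.nn as nn

# Hypothetical models: the checkpoint comes from a model whose final
# layer has a different output size than the current model.
old_model = nn.Sequential(nn.Linear(10, 5), nn.Linear(5, 2))
new_model = nn.Sequential(nn.Linear(10, 5), nn.Linear(5, 3))

checkpoint = old_model.state_dict()
model_state = new_model.state_dict()

# Keep only parameters whose shapes match the current model.
filtered = {
    k: v for k, v in checkpoint.items()
    if k in model_state and v.shape == model_state[k].shape
}

# strict=False ignores the keys we dropped; the mismatched layer
# keeps its (freshly initialized) weights.
result = new_model.load_state_dict(filtered, strict=False)
print(result.missing_keys)  # the parameters that were filtered out
```

`load_state_dict` returns a named tuple with `missing_keys` and `unexpected_keys`, so you can log exactly which parameters were skipped.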