Torch<1.0 to torch>1.0 conversion

First post :smiley:
Hi, I’d like to convert these torch<1.0 .pth.tar weights to torch>1.0 .pth.tar weights. Can anyone point me in the right direction to do this easily? Thanks!

Update:
So I managed to change the keys of the state_dict by doing:

import re

import torch

def convert_state_dict(checkpoint_pth):
    # Load the old (torch<1.0) checkpoint.
    checkpoint = torch.load(checkpoint_pth)
    # torch<1.0 DenseNet checkpoints name batchnorm/conv params 'norm.1.weight' etc.;
    # torch>=1.0 torchvision expects 'norm1.weight', so rename the matching keys.
    pattern = re.compile(r'^(.*denselayer\d+\.(?:norm|relu|conv))\.((?:[12])\.(?:weight|bias|running_mean|running_var))$')
    state_dict = checkpoint['state_dict']
    for key in list(state_dict.keys()):
        res = pattern.match(key)
        if res:
            # e.g. '...denselayer1.norm.1.weight' -> '...denselayer1.norm1.weight'
            new_key = res.group(1) + res.group(2)
            state_dict[new_key] = state_dict[key]
            del state_dict[key]
    return checkpoint['state_dict']
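
For reference, calling the helper above could look like this (the checkpoint path is just a placeholder):

checkpoint_pth = 'model_best.pth.tar'  # placeholder: path to the torch<1.0 checkpoint
new_state_dict = convert_state_dict(checkpoint_pth)
# Keys like '...denselayer1.norm.1.weight' should now read '...denselayer1.norm1.weight'.
print([k for k in new_state_dict if 'denselayer1.norm' in k])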

But when I compare the pre-trained densenet121 model (torch<1.0, see link in previous message) with torchvision.models.densenet121 (torch>1.0), it looks like the torch<1.0 model contains extra keys such as
‘module.densenet121.features.denseblock2.denselayer12.norm2.num_batches_tracked’ and ‘module.densenet121.features.transition2.norm.num_batches_tracked’.
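
For what it’s worth, the mismatch can be made visible by diffing the two key sets. A rough sketch, where new_state_dict is the converted dict from the helper above, and stripping the 'module.densenet121.' prefix assumes the old checkpoint was saved from a DataParallel-wrapped model:

import torchvision

reference_keys = set(torchvision.models.densenet121().state_dict().keys())
old_keys = {k.replace('module.densenet121.', '', 1) for k in new_state_dict}

print(sorted(old_keys - reference_keys))   # keys only in the old checkpoint
print(sorted(reference_keys - old_keys))   # keys only in the torch>=1.0 model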

The mismatched keys all end in “.num_batches_tracked”. Any idea on how to solve this?

Hi,

The num_batches_tracked buffer was added to batchnorm fairly recently. As you can see in the code here, the batchnorm layer should be able to load the state_dict properly even when that key is missing.
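
If the num_batches_tracked buffers do get in the way (for example when stripping the 'module.densenet121.' prefix and loading into a plain torchvision model), it should be safe to simply drop them, since batchnorm re-initializes that counter to 0 when the key is absent. A minimal sketch, assuming the converted new_state_dict from above and a matching classifier head:

import torchvision

model = torchvision.models.densenet121()

cleaned = {}
for key, value in new_state_dict.items():
    if key.endswith('num_batches_tracked'):
        continue  # drop the extra batchnorm counters; they will be re-created as 0
    # strip the DataParallel/wrapper prefix so keys match the torchvision model
    cleaned[key.replace('module.densenet121.', '', 1)] = value

# strict=False tolerates any remaining key mismatches
model.load_state_dict(cleaned, strict=False)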