DataParallel changes parameter names, causing an issue with load_state_dict()

You can just modify the state dict keys by hand, manually removing the `module.` prefix that DataParallel adds:

    import torch
    from collections import OrderedDict

    # Load the checkpoint onto the CPU, regardless of the device it was saved from
    state_dict = torch.load(directory, map_location=lambda storage, loc: storage)

    # Strip the `module.` prefix that DataParallel adds to every parameter name
    new_state_dict = OrderedDict()
    for k, v in state_dict.items():
        if k.startswith('module.'):
            name = k[7:]  # remove `module.`
        else:
            name = k
        new_state_dict[name] = v

    model.load_state_dict(new_state_dict)
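
Alternatively, you can sidestep the renaming entirely, either by saving the state dict of the underlying module (so the keys never get the prefix) or by wrapping the fresh model in DataParallel before loading (so the keys line up). A minimal sketch of both, where `parallel_model`, `plain_model`, and `checkpoint_path` are placeholder names:

    import torch
    import torch.nn as nn

    # Option 1: when saving, reach through the DataParallel wrapper so the
    # checkpoint keys are stored without the `module.` prefix
    torch.save(parallel_model.module.state_dict(), checkpoint_path)

    # Option 2: when loading a checkpoint that was saved with the prefix,
    # wrap the fresh model in DataParallel first so the key names match
    parallel_model = nn.DataParallel(plain_model)
    parallel_model.load_state_dict(torch.load(checkpoint_path))

Option 1 is usually the cleaner fix, since the resulting checkpoint loads into both wrapped and unwrapped models without any key surgery.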