I can save and load the state_dict of the model and optimizer successfully in StateDictType.FULL_STATE_DICT mode. However, I would prefer to save checkpoints in LOCAL_STATE_DICT mode, which spares me from unsharding (all-gathering) the parameters when saving the state_dict and re-sharding them when loading it. Is this possible when I enable use_orig_params=True?