Hi. I’m practicing PyTorch FSDP coding these days and trying to write save/load snapshot code for it.
I ran the example code (torch/distributed/checkpoint/examples/fsdp_checkpoint_example.py) and it works well. However, every time I run it, a warning message is shown:
```
_dedup_tensors.py:44 INFO p:SpawnProcess-1 t:MainThread: Duplicate keys to remove:
```
It appears during this call:

```python
dist_cp.save_state_dict(
    state_dict=state_dict,
    storage_writer=dist_cp.FileSystemWriter(CHECKPOINT_DIR),
)
```
What does it mean, and how can I fix it?
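For context, my current guess (please correct me if I’m wrong) is that with FSDP several ranks can propose to write the same checkpoint key, and the checkpoint planner keeps one copy per key and logs the dropped duplicates at INFO level. Here is a toy, stdlib-only sketch of that dedup idea; the function and variable names are made up for illustration and are not the real torch implementation:

```python
# Toy sketch of my understanding of the dedup step in _dedup_tensors.py.
# Names here are invented for illustration, not the real torch API:
# with FSDP, the same key can appear in several ranks' save plans,
# and the planner keeps only one copy per key.

def dedup_keys(plans_by_rank):
    """Keep each key on the lowest rank that has it; return the
    deduplicated plans and the (rank, key) pairs that were dropped."""
    seen = {}      # key -> rank that will actually write it
    removed = []   # (rank, key) duplicates, like the ones the INFO line lists
    deduped = {}
    for rank in sorted(plans_by_rank):
        kept = []
        for key in plans_by_rank[rank]:
            if key in seen:
                removed.append((rank, key))  # duplicate: skip writing it again
            else:
                seen[key] = rank
                kept.append(key)
        deduped[rank] = kept
    return deduped, removed

plans = {
    0: ["model.weight", "model.bias", "step"],
    1: ["model.weight", "model.bias"],  # same keys proposed by another rank
}
deduped, removed = dedup_keys(plans)
print(removed)  # -> [(1, 'model.weight'), (1, 'model.bias')]
```

If that reading is right, the message would just mean each tensor is saved exactly once, so it is informational rather than an error. Is that correct, or is it a sign something is misconfigured in my setup?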