Thread safety between model.state_dict() and optimizer.step()

Suppose I have the following two threads:

Thread one: Typical training loop that calls optimizer.step() to update the model's parameters

Thread two: “checkpointing loop” that calls model.state_dict()

Will thread two ever receive a partially modified model? Or is there some atomicity and/or synchronization under the hood to prevent this?
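For context, here is a minimal stand-in sketch (pure Python, no PyTorch dependency, with a hypothetical `ToyModel` class) of the kind of explicit locking that would guarantee consistent snapshots if no synchronization is provided under the hood. The invariant `w1 == w2` plays the role of "the checkpoint reflects a single, complete optimizer step":

```python
import threading

class ToyModel:
    """Stand-in for an nn.Module: two parameters kept equal as an invariant."""
    def __init__(self):
        self.w1 = 0
        self.w2 = 0

    def step(self):
        # Simulates optimizer.step(): parameters are updated one at a time,
        # so an unsynchronized reader could observe w1 != w2 mid-update.
        self.w1 += 1
        self.w2 += 1

    def state_dict(self):
        return {"w1": self.w1, "w2": self.w2}

model = ToyModel()
lock = threading.Lock()  # guards both step() and state_dict()
snapshots = []

def train_loop(steps=10_000):
    # Thread one: training loop; each step happens atomically under the lock.
    for _ in range(steps):
        with lock:
            model.step()

def checkpoint_loop(n=1_000):
    # Thread two: checkpointing loop; snapshots are taken under the same lock.
    for _ in range(n):
        with lock:
            snapshots.append(model.state_dict())

t1 = threading.Thread(target=train_loop)
t2 = threading.Thread(target=checkpoint_loop)
t1.start(); t2.start()
t1.join(); t2.join()

# With the lock held across each update and each snapshot, every snapshot
# is internally consistent: it never mixes old and new parameter values.
assert all(s["w1"] == s["w2"] for s in snapshots)
```

In real PyTorch code the same pattern would wrap `optimizer.step()` and `model.state_dict()` (plus a deep copy of the returned tensors, since `state_dict()` returns references to the live parameters) in one shared lock.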