Is it safe to save an HF transformer model by calling save_pretrained() on GradSampleModule’s private _module attribute?

I was wondering whether it is safe to save an HF transformer model by calling save_pretrained() on the private _module attribute of the GradSampleModule wrapper, and then to load it the usual way with the HF library. I’m doing this because the GradSampleModule wrapper around the model doesn’t expose any method like save_pretrained. My setup looks like this:

model, optimizer, dataloader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=dataloader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
)
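My understanding (and this is an assumption on my part, not something I’ve confirmed in the Opacus docs) is that make_private only wraps the module, so model._module is still the exact object I passed in, and its weights are the ones being updated during DP training. A minimal toy sketch of what I mean, using a hypothetical nn.Linear model rather than my actual HF model:

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Hypothetical toy setup, only to check the wrapping behaviour
original = nn.Linear(4, 2)
optimizer = torch.optim.SGD(original.parameters(), lr=0.1)
dataloader = DataLoader(
    TensorDataset(torch.randn(8, 4), torch.randint(0, 2, (8,))),
    batch_size=4,
)

privacy_engine = PrivacyEngine()
wrapped, optimizer, dataloader = privacy_engine.make_private(
    module=original,
    optimizer=optimizer,
    data_loader=dataloader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
)

# If this assumption holds, the wrapper's private _module attribute is the
# original model object, so saving through it saves the trained weights
assert wrapped._module is original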

—Training—

model._module.save_pretrained(Path) 

and loading the model for inference using

from transformers import xyz

model = xyz.from_pretrained(Path)

I’m not sure whether this is safe, because I’m getting very good accuracy and privacy numbers from the model saved and loaded with the snippet above.
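One check I was thinking of running (sketch only, assuming model is still the GradSampleModule returned by make_private, reloaded is the model that from_pretrained gives back, and both are the same HF model class) is to compare the two state dicts tensor by tensor after the round trip:

import torch

saved_state = model._module.state_dict()
reloaded_state = reloaded.state_dict()

# If the round trip is faithful, every tensor saved through _module should
# come back unchanged from from_pretrained
mismatched = [
    name
    for name, tensor in saved_state.items()
    if name not in reloaded_state
    or not torch.equal(tensor.detach().cpu(), reloaded_state[name].detach().cpu())
]
print("mismatched tensors:", mismatched)  # empty list -> weights survived save/load intact

Would that be enough to confirm the approach, or is there a recommended way to save/load a model that has gone through make_private?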