Implementing register_grad_sampler for an unsupported layer

I’m running into a `torchdp.dp_model_inspector.IncompatibleModuleException` when trying to train a private version of the following model: transformers/modeling_vit.py at cde0c750af2fae6848ed3ee8be381b4f1230ecd0 · anibadde/transformers · GitHub

To resolve this, I attempted to implement a `register_grad_sampler` for the `nn.Parameter` layer, but even after doing so I’m still running into the same error. Are there docs on how to implement this correctly?
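For reference, this is roughly the shape of what I tried. Since I’m not sure of the exact API in my torchdp version, the snippet below uses a toy registry that mimics the decorator pattern (the `GRAD_SAMPLERS` dict and `register_grad_sampler` here are illustrative stand-ins, not the library’s actual internals), with `nn.Linear` as the example layer:

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the library's grad-sampler registry
# (assumption: the real decorator maps a module class to a function
# that returns per-sample gradients for that module's parameters).
GRAD_SAMPLERS = {}

def register_grad_sampler(module_cls):
    def decorator(fn):
        GRAD_SAMPLERS[module_cls] = fn
        return fn
    return decorator

@register_grad_sampler(nn.Linear)
def linear_grad_sample(layer, activations, backprops):
    # Per-sample gradients: one gradient per example in the batch,
    # i.e. the outer product of backprops and activations per sample.
    ret = {layer.weight: torch.einsum("ni,nj->nij", backprops, activations)}
    if layer.bias is not None:
        ret[layer.bias] = backprops
    return ret
```

Summing the per-sample gradients over the batch dimension should recover the ordinary autograd gradient, which is how I sanity-checked the sampler function itself.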