Using nn.Parameter in Opacus

Hey there, we are trying to train a private version of a particular model that uses nn.Parameter, and we're getting the error torchdp.dp_model_inspector.IncompatibleModuleException.

In particular, the parameters are defined for the model here and used in the forward function here. To the best of our knowledge, these parameters and their associated operations preserve privacy, since they don't compute any aggregate batch statistics. What would be the recommended way to train with this model definition?
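For reference, the pattern in question looks roughly like this (a hypothetical sketch, since the linked code isn't shown here; ScaleModel and its scale parameter are illustrative names, not the actual model):

```python
import torch
import torch.nn as nn


class ScaleModel(nn.Module):
    # Hypothetical example of the pattern: a bare nn.Parameter used
    # directly in forward(), outside any supported layer type.
    def __init__(self, dim):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(dim))  # defined on the model

    def forward(self, x):
        # Purely per-example elementwise op: no batch statistics involved.
        return x * self.scale
```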

Is there some sort of workaround we could use to wrap these lines in a valid module? Or do we need to wait for the team to add an accepted module to opacus.SUPPORTED_LAYERS?

Hi Chris!

Luckily, you don't need to fork or wait for the team to change that for you. You will, however, be responsible for writing a function that computes grad_sample (the per-sample gradient) for your layer. Once you've written it, you simply register it for your layer with the @register_grad_sampler decorator and you should be good to go!
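As a concrete illustration, here's a minimal sketch continuing the hypothetical ScaleModel example above: wrap the bare parameter in its own small module, then register a grad sampler for that module. The ScaleLayer name is an assumption, and the exact grad-sampler signature varies a bit across Opacus versions (e.g. whether activations arrives as a single tensor or a list of tensors), so check the signature against the version you're running.

```python
import torch
import torch.nn as nn
from opacus.grad_sample import register_grad_sampler


class ScaleLayer(nn.Module):
    """Hypothetical wrapper module holding the bare nn.Parameter."""

    def __init__(self, dim):
        super().__init__()
        self.scale = nn.Parameter(torch.ones(dim))

    def forward(self, x):
        # Elementwise scaling; no interaction between samples in the batch.
        return x * self.scale


@register_grad_sampler(ScaleLayer)
def compute_scale_grad_sample(layer, activations, backprops):
    # In recent Opacus versions, activations is a list of the module's
    # forward inputs; in older versions it may be a single tensor.
    x = activations[0] if isinstance(activations, (list, tuple)) else activations
    # For y = x * scale, the per-sample gradient w.r.t. scale is
    # x_i * (dL/dy_i), elementwise, so the returned grad_sample has
    # shape [batch_size, *scale.shape].
    return {layer.scale: x * backprops}
```

With that registered, GradSampleModule (and the PrivacyEngine) should treat ScaleLayer like any other supported layer; in your model's forward you'd call the wrapper module instead of using the raw parameter inline.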


Hi! I've been running into a similar issue. @ChrisWaites, did you manage to get this working?

Hi Chris, were you able to write a grad_sample function for the nn.Parameter? I was stuck on this problem as well.