Hello, I am using Opacus to build a DP Hidden Markov Model (a differentially private HMM). I was able to run the code using two Opacus methods, namely make_private() and make_private_with_epsilon(). I printed the epsilon value at each epoch of training using privacy_engine.get_epsilon(), and I observed that epsilon grows much faster over the training epochs with make_private() than with make_private_with_epsilon().
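For context, here is roughly how I'm calling make_private() and printing epsilon each epoch. This is a minimal sketch: the toy Linear model, SGD optimizer, and random dataset are placeholders standing in for my actual HMM setup, and the noise_multiplier, max_grad_norm, and delta values are arbitrary.

```python
import torch
from opacus import PrivacyEngine

# Placeholder model/optimizer/data standing in for my actual DP-HMM setup
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data_loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))),
    batch_size=8,
)

privacy_engine = PrivacyEngine()

# Variant 1: make_private() -- the noise level is specified directly
model, optimizer, data_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    noise_multiplier=1.0,
    max_grad_norm=1.0,
)

criterion = torch.nn.CrossEntropyLoss()
for epoch in range(5):
    for x, y in data_loader:
        optimizer.zero_grad()
        criterion(model(x), y).backward()
        optimizer.step()
    # get_epsilon() takes the target delta and reports the budget spent so far
    print(f"epoch {epoch}: epsilon = {privacy_engine.get_epsilon(delta=1e-5):.2f}")
```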
So I am trying to understand why that is the case, what the difference between these two functions is more broadly, and when one should prefer one over the other.
My understanding is that make_private_with_epsilon() takes the parameters target_epsilon and epochs, so I believe this function adds a variable amount of noise at each step in order to land on the target epsilon value. On the other hand, make_private() doesn't take an epochs parameter, but instead takes noise_multiplier and max_grad_norm. The documentation defines noise_multiplier as the ratio of the standard deviation of the noise to the L2 sensitivity of the gradients. I believe the L2 sensitivity of the gradients can change over the training epochs, and hence I think the standard deviation of the noise added at each epoch can also change.
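For comparison, this is how I call make_private_with_epsilon(). Again a minimal sketch with the same placeholder setup as above; the target_epsilon, target_delta, and epochs values are arbitrary, and printing optimizer.noise_multiplier at the end is just my attempt to inspect what noise level the engine settled on.

```python
import torch
from opacus import PrivacyEngine

# Same placeholder setup as before, standing in for my actual HMM
model = torch.nn.Linear(10, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
data_loader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(torch.randn(64, 10), torch.randint(0, 2, (64,))),
    batch_size=8,
)

privacy_engine = PrivacyEngine()

# Variant 2: make_private_with_epsilon() -- instead of taking noise_multiplier
# directly, it takes a privacy budget (target_epsilon, target_delta) and a
# training horizon (epochs), and determines the noise level internally.
model, optimizer, data_loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    target_epsilon=5.0,
    target_delta=1e-5,
    epochs=5,
    max_grad_norm=1.0,
)

# The returned DPOptimizer carries a noise_multiplier attribute, which I'm
# printing to see what value was computed for my target budget.
print("noise_multiplier:", optimizer.noise_multiplier)
```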
In summary, my understanding is that under both methods the amount of noise added could differ from one training epoch to the next. Yet one function produces a much higher rate of growth of epsilon over the training epochs, and I don't understand why. Any thoughts? I appreciate your time.