What is the quickest way to create perturbations of the distributions already available in PyTorch distributions? For example, if I want to create a new distribution $(1-\epsilon)N(\mu,\sigma^2) + \epsilon \delta_x$, where $\delta_x$ is a point mass at $x$, what would be the quickest way to do it? My other goal is to perturb some pdfs with Gaussian noise: if $f(x)$ is a probability density function, I would like to construct a new density by essentially the same equation as above, $(1-\epsilon)f(x) + \epsilon N(\mu,\sigma^2)$.
Thanks very much.
An affine transform of a Gaussian is another Gaussian distribution. Look at https://pytorch.org/docs/master/distributions.html#torch.distributions.transforms.AffineTransform and the other bijective transforms there.
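As a minimal sketch of the point above (the distribution and parameters are just illustrative): wrapping a standard normal in an `AffineTransform` via `TransformedDistribution` gives the same density as constructing the shifted/scaled Gaussian directly.

```python
import torch
from torch.distributions import Normal, TransformedDistribution
from torch.distributions.transforms import AffineTransform

# An affine transform of N(0, 1) with loc=2, scale=3 is exactly N(2, 3^2).
base = Normal(0.0, 1.0)
shifted = TransformedDistribution(base, [AffineTransform(loc=2.0, scale=3.0)])

x = torch.tensor(2.0)
# Both log-densities agree at the same point.
print(shifted.log_prob(x))
print(Normal(2.0, 3.0).log_prob(x))
```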
Generally, you can’t mess with pdfs arbitrarily, as they have to integrate to 1. Valid approaches to constructing pdfs include mixture distributions, compound distributions, and normalizing flows (an advanced version of bijective transforms).
Thank you for your answer. What I was describing is basically creating various mixture distributions. I still cannot figure out how to create mixture distributions, except via MixtureSameFamily in PyTorch, which only covers components from the same family.
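For reference, the built-in `MixtureSameFamily` mentioned here handles the same-family case directly; here is a sketch of a two-component Gaussian mixture matching the $(1-\epsilon)$ / $\epsilon$ weighting from the question (the concrete parameter values are assumptions):

```python
import torch
from torch.distributions import Categorical, Normal, MixtureSameFamily

eps = 0.1  # mixing weight, illustrative value

# Mixing weights (1 - eps, eps) over two Gaussian components.
mix = Categorical(probs=torch.tensor([1.0 - eps, eps]))
comp = Normal(loc=torch.tensor([0.0, 5.0]), scale=torch.tensor([1.0, 2.0]))
gmm = MixtureSameFamily(mix, comp)

samples = gmm.sample((1000,))
print(gmm.log_prob(torch.tensor(0.0)))
```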
Well, a mixture’s log_prob() is not hard to write (logsumexp() can be handy). But there is not much else to put into a distribution object; it would just contain a Categorical and the component distribution objects. And differentiable (reparameterized) sampling won’t work because of the Categorical.
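A hand-rolled log_prob along these lines might look as follows, for the mixture $(1-\epsilon)f(x) + \epsilon N(\mu,\sigma^2)$ from the question. The choice of `Laplace` for $f$ and the parameter values are assumptions; `torch.logsumexp` keeps the computation numerically stable.

```python
import math
import torch
from torch.distributions import Normal, Laplace

eps = 0.1                 # illustrative mixing weight
f = Laplace(0.0, 1.0)     # any base distribution exposing log_prob
noise = Normal(0.0, 2.0)  # the Gaussian perturbation

def mixture_log_prob(x):
    # log[(1-eps) f(x) + eps g(x)]
    #   = logsumexp(log(1-eps) + log f(x), log(eps) + log g(x))
    terms = torch.stack([
        math.log(1.0 - eps) + f.log_prob(x),
        math.log(eps) + noise.log_prob(x),
    ])
    return torch.logsumexp(terms, dim=0)

x = torch.tensor([0.0, 1.0])
print(mixture_log_prob(x))
```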