Operations on PyTorch distributions?

I was wondering whether it would be interesting to have operations like +, *, etc. on distributions from torch.distributions?

e.g.

X = torch.distributions.Normal(loc=1, scale=1)
Y = torch.distributions.Normal(loc=0, scale=2)
(X + Y).mean
(X * Y).stddev

Of course, one could draw samples from X and Y and perform the operations on those samples instead, but I was wondering 1) whether that is a good approximation and 2) whether the above example would be feasible.
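
For concreteness, here is a rough sketch of the sampling workaround I have in mind, assuming X and Y are independent; the sample statistics approximate the moments of the resulting distribution:

import torch

X = torch.distributions.Normal(loc=1.0, scale=1.0)
Y = torch.distributions.Normal(loc=0.0, scale=2.0)

# Draw many independent samples and apply the operation elementwise.
n = 100_000
x, y = X.sample((n,)), Y.sample((n,))

print((x + y).mean(), (x + y).std())  # approx. 1.0 and sqrt(1 + 4) ~ 2.236
print((x * y).mean(), (x * y).std())  # approx. 0.0 and sqrt(8) ~ 2.828 for independent normals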

Depending on which methods you need from the resulting distribution, and on whether you operate within the same family, different techniques are required (analytical formulas, Monte Carlo approximation, Fourier transforms). So it would be hard to create a generic random-variable algebra mechanism, and it is a bit outside the ML realm; for example, the R package ‘distr’ implements this kind of algebra, but there it is used interactively or for one-off computations, not for deep learning.
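
To make the "analytical formulas" case concrete: the sum of two independent normals stays in the family, with means adding and variances adding, so that one operation could be implemented in closed form. A minimal sketch, using a hypothetical helper name of my own (this is not part of torch.distributions):

import torch

def add_independent_normals(p, q):
    # Hypothetical helper: X ~ N(mu1, s1^2), Y ~ N(mu2, s2^2), X independent of Y
    # implies X + Y ~ N(mu1 + mu2, s1^2 + s2^2).
    loc = p.loc + q.loc
    scale = (p.scale ** 2 + q.scale ** 2).sqrt()
    return torch.distributions.Normal(loc, scale)

X = torch.distributions.Normal(loc=torch.tensor(1.0), scale=torch.tensor(1.0))
Y = torch.distributions.Normal(loc=torch.tensor(0.0), scale=torch.tensor(2.0))
Z = add_independent_normals(X, Y)
print(Z.mean, Z.stddev)  # tensor(1.), tensor(2.2361)

The product X * Y, by contrast, is not normal, so that case would already need a different method (e.g. Monte Carlo), which is exactly why a single generic mechanism is hard.
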
You can of course do deep learning with sampling, and that is the usual approach, but then the need for this kind of arithmetic is unusual.
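
One note on the sampling route: torch.distributions supports the reparameterization trick via rsample(), so gradients flow through the samples, which is what makes sampling usable inside deep learning. A minimal sketch:

import torch

loc = torch.tensor(1.0, requires_grad=True)
X = torch.distributions.Normal(loc=loc, scale=1.0)

# rsample() draws reparameterized samples, so the result is differentiable w.r.t. loc.
samples = X.rsample((100_000,))
loss = (samples ** 2).mean()  # Monte Carlo estimate of E[X^2] = loc^2 + 1
loss.backward()
print(loc.grad)  # approx. d/d(loc) E[X^2] = 2 * loc = 2.0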