# Multivariate normal sampling function

Is there a torch function that takes in as input the mean tensor and the covariance tensor of a multivariate normal distribution and returns samples from the distribution?

I noticed that there is torch.normal, but it takes a per-element standard deviation and, as far as I can tell, doesn’t accept a correlation structure.

Hi @vvanirudh, I’m not sure whether you’re familiar with the phenomenon, but when you train nets or statistical models that use multivariate normal distributions, usually something called “factor rotation” occurs.

Basically, all the correlations tend to go to zero, so usually you don’t have to worry about modelling the full covariance matrix, which simplifies things considerably. Maybe that would be a good starting point?


Hi @AjayTalati, I wasn’t planning to model the full covariance matrix in the way you mention. I am currently predicting the parameters of a 2D Gaussian distribution (mean_x, mean_y, std_x, std_y and corr), from which I subsequently sample to get the input at the next time step. For this I need access to a function that can sample from the full 2D Gaussian distribution (like np.random.multivariate_normal, but a torch analog, if one exists).
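For reference, one way to do this in pure torch is to assemble the 2×2 covariance matrix from those five predicted parameters and sample through its Cholesky factor. A minimal sketch, assuming a recent torch with `torch.linalg.cholesky`; the helper name and argument names are just illustrative:

```python
import torch

def sample_bivariate(mean_x, mean_y, std_x, std_y, corr, n=1):
    # Build the 2x2 covariance matrix from the stds and correlation
    cov = torch.tensor([
        [std_x * std_x,        corr * std_x * std_y],
        [corr * std_x * std_y, std_y * std_y],
    ])
    mean = torch.tensor([mean_x, mean_y])
    # Cholesky factor L with cov = L @ L.T
    L = torch.linalg.cholesky(cov)
    z = torch.randn(n, 2)        # independent standard normal draws
    return mean + z @ L.T        # shape (n, 2)

samples = sample_bivariate(0.0, 0.0, 1.0, 2.0, 0.5, n=10000)
```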

You can calculate the bivariate Gaussian yourself without the covariance matrix using the equation in
http://mathworld.wolfram.com/BivariateNormalDistribution.html
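Concretely, using the standard parameterization from that page, you can draw a correlated pair directly without ever forming a covariance matrix: take two independent standard normals and mix the first into the second with weight rho. A sketch (parameter names are illustrative):

```python
import math
import random

def sample_bivariate_normal(mu_x, mu_y, std_x, std_y, rho):
    # Two independent standard normal draws
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    # Mixing z1 into the second coordinate induces correlation rho
    x = mu_x + std_x * z1
    y = mu_y + std_y * (rho * z1 + math.sqrt(1.0 - rho * rho) * z2)
    return x, y
```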


You should be able to construct a covariance matrix and sample using numpy,

https://docs.scipy.org/doc/numpy/reference/generated/numpy.random.multivariate_normal.html

then just use, `torch.from_numpy(samples)`
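Put together, that approach might look like this (the mean and covariance values are just placeholders):

```python
import numpy as np
import torch

mean = np.array([0.0, 0.0])
cov = np.array([[1.0, 0.5],
                [0.5, 1.0]])
# Draw samples with numpy, then hand them to torch
samples_np = np.random.multivariate_normal(mean, cov, size=1000)
samples = torch.from_numpy(samples_np)  # shape (1000, 2), dtype float64
```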

Hello @vvanirudh,

does this serve your needs (using https://en.wikipedia.org/wiki/Multivariate_normal_distribution#Drawing_values_from_the_distribution)?

```python
import torch
import numpy

# target covariance
cov = numpy.array(((1, 0.5), (0.5, 1)), dtype=numpy.float32)
# compute the Cholesky factor in numpy and move it to torch
l = torch.from_numpy(numpy.linalg.cholesky(cov))
# sample standard normals and multiply by the factor
rnd = torch.mm(l, torch.randn(2, 10000))
# check the empirical covariance in numpy
print(numpy.cov(rnd.numpy()))
```

Best regards

Thomas


Hi @AjayTalati - would you mind expanding on “factor rotation” a bit? It would be great if you could point me to some material that discusses this issue.

I’m looking at multivariate mixture density networks and modelling sigma with just the lower triangular matrix elements.

Thanks.


So wait, is there no way to do this directly with code provided by the developers of PyTorch?

Shouldn’t you be using:

http://pytorch.org/docs/master/distributions.html#torch.distributions.multivariate_normal.MultivariateNormal

`torch.distributions.multivariate_normal.MultivariateNormal`
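For completeness, usage looks roughly like this (the mean and covariance values are arbitrary):

```python
import torch
from torch.distributions.multivariate_normal import MultivariateNormal

mean = torch.zeros(2)
cov = torch.tensor([[1.0, 0.5],
                    [0.5, 1.0]])
dist = MultivariateNormal(mean, covariance_matrix=cov)
samples = dist.sample((1000,))  # shape (1000, 2)
```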

Thanks. I noticed that this is in master, but I was using 0.3.1 at the time. I ended up doing the work in TensorFlow, which already has a rich set of APIs for this.