# Generate one-hot encodings from a probability matrix, fast

I asked this question on Stack Overflow with NumPy in mind, but I'm curious to see whether PyTorch has a good solution for this, as I'm intending to use it in PyTorch. (Stack Overflow question here: https://stackoverflow.com/questions/63948265/generate-random-vectors-with-a-given-numerical-distribution-matrix )

I’m trying to come up with a fast and smart way of generating random one-hot vectors from a distribution matrix.

To give a quick example of what I want to do: given a probability matrix of shape (n, 3), where each row sums to one:

```
p = torch.tensor([[0.2, 0.4, 0.4],
                  [0.1, 0.7, 0.2],
                  [0.44, 0.5, 0.06],
                  ...])
```

I wish to draw n elements, where each element takes the value 0, 1, or 2 and is selected with the probability given in the corresponding row of the matrix above.

Is there a fast way of doing this that doesn't involve for loops? I want to use it for data augmentation, so it needs to be fast.

Hi,

You can try

```
_, lbls = torch.max(p, 1)
```

Thanks
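For reference, `torch.max(p, 1)` returns a `(values, indices)` pair, where the indices are the per-row argmax. That means `lbls` is always the most probable class for each row, i.e. a deterministic choice rather than a random draw from the distribution:

```python
import torch

# Example probability matrix from the question (first three rows).
p = torch.tensor([[0.2, 0.4, 0.4],
                  [0.1, 0.7, 0.2],
                  [0.44, 0.5, 0.06]])

# torch.max over dim=1 returns (max values, argmax indices);
# the same input always yields the same labels.
vals, lbls = torch.max(p, 1)
```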

```
torch.zeros_like(p).scatter_(1, torch.multinomial(p, 1), 1.)
```
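Putting it together, a minimal sketch of the `multinomial` + `scatter_` approach (using the example matrix from the question, truncated to three rows):

```python
import torch

torch.manual_seed(0)  # only so this sketch is reproducible

# Example probability matrix from the question; each row sums to 1.
p = torch.tensor([[0.2, 0.4, 0.4],
                  [0.1, 0.7, 0.2],
                  [0.44, 0.5, 0.06]])

# Draw one class index per row according to that row's probabilities.
idx = torch.multinomial(p, 1)  # shape (n, 1)

# Scatter 1.0 into the drawn column of a zero matrix -> one-hot rows.
one_hot = torch.zeros_like(p).scatter_(1, idx, 1.0)
```

Both operations are batched over the n rows, so there is no Python loop: `torch.multinomial` does the per-row sampling and `scatter_` builds the one-hot vectors in place.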