Best way to implement series of weighted random sampling (for transition w/ stochastic matrix)?

Dear experts,

I am trying to find the best way to perform a series of weighted random samplings, each from a different distribution. I will try to elaborate on what I intend to do (maybe if you just jump to the code it will be immediately clear).

I have an input x of length L. Each element is an integer between 0 and N-1 (inclusive).
I want to generate an output y of length L, where each element of y is an integer between 0 and M-1 (inclusive).

Each element y[i] is generated from the corresponding element x[i], according to a transition probability distribution defined by an N×M stochastic matrix p.

p[i][j] denotes the probability that y[k] becomes j, given that x[k] is i.

Basically, if N==M, I am trying to do a Markov chain transition.

Below is a (very slow) implementation.

import torch

N = 10
M = 20
L = 5

# Unnormalized weight matrix (rows need not sum to 1; see note below).
p = torch.rand([N, M])

x = torch.randint(0, N, [L])
print(x)
# Use an integer dtype, since y holds category indices.
y = torch.zeros([L], dtype=torch.long)

# Draw one sample per element, weighted by the row of p selected by x[i].
for i in range(L):
    y[i] = list(torch.utils.data.WeightedRandomSampler(p[x[i]], 1))[0]
print(y)

Here, I am not normalizing p row-wise (i.e., it is not strictly a stochastic matrix), because WeightedRandomSampler does not require the weights to be normalized.
The problem with this code is the Python for loop, which becomes too slow for large L.
Is there a way to implement the same thing faster, without the for loop?

Any insights would be highly appreciated.

Thank you.

Actually, I figured it out. There is a function called torch.multinomial(), which seems to do exactly what I wanted.
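
For anyone landing here later, a minimal sketch of the vectorized version: torch.multinomial accepts a 2-D weight matrix and draws samples independently for each row, so indexing p with x gives an L×M matrix of per-element weight rows and a single call replaces the loop. (Variable names match the question; rows still do not need to be normalized.)

```python
import torch

N = 10
M = 20
L = 5

# Unnormalized transition weights, as in the question.
p = torch.rand([N, M])

x = torch.randint(0, N, [L])

# p[x] has shape [L, M]: one weight row per input element.
# torch.multinomial draws num_samples from each row independently,
# returning shape [L, 1]; squeeze gives the final [L] vector of indices.
y = torch.multinomial(p[x], num_samples=1).squeeze(1)
print(y)
```

Each y[i] is then an index in [0, M) drawn with probability proportional to p[x[i]], without any Python-level loop.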