Duplicate results due to lack of entropy using multinomial without replacement?

Hi Jeff!

I can confirm your result, and I do believe that it is a bug.

A lack of entropy isn’t the cause (but I don’t know what the cause is).

By default, torch.multinomial() uses PyTorch’s “global” random-number
generator, which on the CPU is (from memory) a Mersenne Twister with a
period vastly greater than 2**32.
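For illustration, here is a minimal sketch (the weights, sizes, and seed below are
placeholders I chose, not anything from your script) showing that multinomial()
draws from the global generator unless you hand it an explicit torch.Generator:

```python
import torch

# by default multinomial() consumes randomness from pytorch's global
# cpu generator, which is what torch.manual_seed() seeds
torch.manual_seed(2023)                 # placeholder seed
weights = torch.ones(5)                 # placeholder uniform weights
perm_a = torch.multinomial(weights, 5, replacement=False)

# the same call, but drawing from an explicit, separately-seeded generator
gen = torch.Generator()
gen.manual_seed(2023)
perm_b = torch.multinomial(weights, 5, replacement=False, generator=gen)

print(perm_a)
print(perm_b)
```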

I don’t think that this is true. I speculate (but have not confirmed) that
multinomial() will generate all permutations with non-zero probability,
but does so sufficiently non-uniformly that duplicates are generated
much too frequently.
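As a point of comparison (my own back-of-the-envelope numbers, not anything
from your post): if the draws were uniform over all k! orderings, the expected
number of duplicate pairs among m draws would be about m * (m - 1) / (2 * k!),
the usual birthday-problem approximation:

```python
import math

k = 12          # items per permutation (placeholder)
m = 10_000      # number of draws (placeholder)

n_perms = math.factorial(k)                    # 12! ~ 4.8e8 equally likely orderings
expected_pairs = m * (m - 1) / (2 * n_perms)   # birthday-problem approximation

print(f"expected duplicate pairs under uniform sampling: {expected_pairs:.3f}")
# roughly 0.1 for these numbers, so seeing many exact duplicates in a run of
# this size would suggest the permutations are not being drawn uniformly
```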

I don’t think that you are missing anything. I see the same thing
with a similar script.
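In case it is useful, here is a minimal sketch of the kind of check I have in mind
(not your original script; the sizes and seed are placeholders): draw many full
permutations with replacement = False and count exact repeats:

```python
import torch
from collections import Counter

torch.manual_seed(2023)             # placeholder seed

k = 12                              # items per permutation (placeholder)
m = 10_000                          # number of draws (placeholder)
weights = torch.ones(k)             # uniform weights

# draw m full permutations without replacement and tally exact repeats
counts = Counter(
    tuple(torch.multinomial(weights, k, replacement=False).tolist())
    for _ in range(m)
)

duplicates = sum(c - 1 for c in counts.values() if c > 1)
print(f"distinct permutations: {len(counts)}, duplicated draws: {duplicates}")
# under uniform sampling over 12! orderings, duplicates should be very rare;
# a noticeably larger count reproduces the behavior you reported
```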

I will start a new thread with the details to highlight this bug you’ve
found.

[Edit: This is the new thread.]

Best.

K. Frank