Is F.gumbel_softmax correct?

In the PyTorch source on GitHub (pytorch/pytorch at commit b4ed13ea0ff091328c6d0dfdf5d751d4280fb67f) you can see that, in F.gumbel_softmax, samples from Gumbel(0, 1) are drawn in the following way:

    gumbels = (
        -torch.empty_like(logits, memory_format=torch.legacy_contiguous_format).exponential_().log()
    )  # ~Gumbel(0,1)

Is this correct? torch.empty_like returns a tensor with uninitialized memory, and nothing here seems to initialize it.

Yes, it's correct. You are right that torch.empty_like creates a tensor with uninitialized memory, but the subsequent in-place .exponential_() call then fills it with samples from an exponential distribution, as described in the docs. The -log of an Exponential(1) sample is distributed as Gumbel(0, 1), so the uninitialized values are never actually read.
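To make the identity concrete (this check is not from the thread, just an illustration): if U ~ Uniform(0, 1), then -log(U) ~ Exponential(1), so -log(Exponential(1)) is exactly the standard inverse-CDF Gumbel sampler -log(-log(U)). A quick empirical sanity check is to compare the sample moments against the known Gumbel(0, 1) mean (the Euler-Mascheroni constant, about 0.5772) and variance (pi^2 / 6, about 1.6449):

```python
import math
import torch

# 1. .exponential_() fills the (uninitialized) tensor with Exponential(1) samples.
# 2. If E ~ Exponential(1), then -log(E) ~ Gumbel(0, 1).
# Gumbel(0, 1) has mean = Euler-Mascheroni constant (~0.5772)
# and variance = pi^2 / 6 (~1.6449), so we can check the moments.
torch.manual_seed(0)
samples = -torch.empty(1_000_000).exponential_().log()

print(samples.mean().item())  # close to 0.5772
print(samples.var().item())   # close to pi^2 / 6
```

With a million samples the empirical mean and variance land within about a hundredth of the theoretical values, which is a reasonable smoke test that the memory really is being overwritten with valid Gumbel noise.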


I see, thank you! I thought .exponential_() was f(x) = e^x!

Yeah, that's an easy mix-up, but the elementwise f(x) = e^x is done by tensor.exp().