Torch.nn.functional.one_hot not working as expected


I was trying to convert a numpy array, as shown in the cell below, to a one-hot encoded vector. The shape of the input gt_arr is [N, H, W] and I wanted to convert it to [N, 3, H, W] using the command below:

gt_arr_hot = nn.functional.one_hot(torch.from_numpy(gt_arr).to(torch.int64), num_classes=3).permute(0,3,1,2)
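For reference, here is a small self-contained sketch of what that call is expected to do (the array shape and class count are stand-ins for the real gt_arr): one_hot appends the class dimension last, giving [N, H, W, 3], and the permute moves it into the channel position, giving [N, 3, H, W].

```python
import numpy as np
import torch
import torch.nn.functional as F

# Stand-in for the real gt_arr: class indices 0..2 with shape [N, H, W].
gt_arr = np.random.randint(0, 3, size=(2, 4, 4)).astype(np.uint8)

# one_hot requires int64 indices and appends the class dim last ([N, H, W, 3]);
# permute moves it to the channel position ([N, 3, H, W]).
gt_arr_hot = F.one_hot(torch.from_numpy(gt_arr).to(torch.int64),
                       num_classes=3).permute(0, 3, 1, 2)

print(gt_arr_hot.shape)     # torch.Size([2, 3, 4, 4])
print(gt_arr_hot.unique())  # tensor([0, 1])
```

A correct one-hot tensor should therefore only ever contain the values 0 and 1.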

I get some weird output, as shown below:

np.unique(gt_arr) ---> array([0, 1, 2], dtype=uint8)

gt_arr_hot.unique() ---> tensor([0, 1, 281474976710656, 2251799813685248, 4503599627370496, 4503599627370497, 9007199254740992, 9007199254740993, 18014398509481984, 18014398509481985, 22517998136852480, 36028797018963968])

It doesn’t happen often, but it happens randomly on some arrays. Am I doing something wrong here? Any help is much appreciated. Thank you in advance, and excuse my formatting skills.

PyTorch version: 1.7.1

The output points towards an uninitialized tensor and looks like a bug.
I’m not able to reproduce it using this code snippet:

import numpy as np
import torch
from torch import nn

for _ in range(10000):
    N, H, W = 8, 24, 24
    gt_arr = np.random.randint(0, 3, (N, H, W))
    gt_arr_hot = nn.functional.one_hot(torch.from_numpy(gt_arr).to(torch.int64), num_classes=3).permute(0, 3, 1, 2)
    # A valid one-hot tensor contains only the values 0 and 1.
    if gt_arr_hot.unique().nelement() != 2:
        print(gt_arr_hot.unique())

Could you run it and check whether you are able to get these invalid outputs with it?

I’m also unable to reproduce the above results now. I’m not sure what caused this to happen, but it was a strange result. I’ll post here if I find something more or if I’m able to consistently reproduce the bad results. Thank you @ptrblck.


I encountered the same issue. PyTorch 1.7.1…

I apologize for posting this question here before a thorough analysis. I ran memtest on my RAM and found that several memory locations on one of my RAM modules were corrupt and returned wrong results on reads, hence the unexpected output.
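The RAM explanation fits the numbers, for what it's worth: every bogus value in the unique() output above is just 0 or 1 with one or two high bits flipped, which is the signature of memory corruption rather than a logic bug. A quick check (values copied from the output above):

```python
# Each bogus value from gt_arr_hot.unique() decomposes into a valid
# one-hot value (0 or 1, the low bit) plus one or two flipped high bits.
bad = [281474976710656, 2251799813685248, 4503599627370496,
       4503599627370497, 9007199254740992, 9007199254740993,
       18014398509481984, 18014398509481985, 22517998136852480,
       36028797018963968]

for v in bad:
    low = v & 1                         # the underlying one-hot value
    flips = bin(v - low).count("1")     # extra high bits set by corruption
    print(f"{v}: one-hot value {low}, {flips} flipped bit(s)")
```

For example, 281474976710656 is exactly 2**48 (a single flipped bit on a zero), and 4503599627370497 is 2**52 + 1 (a flipped bit on a one).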