Creating a one-hot vector from indices given as a tensor

I have a tensor of size 16 x 28, where 16 is the batch size and 28 is the sentence length. Every element of the sentence vectors is some index (0 to n). I want to create a 16 x 28 x n tensor where the vectors along the 3rd dimension are one-hot encodings of those indices, i.e. a 1 at the specified index and zeros everywhere else. How can I do that using PyTorch functionality?

Right now, I am doing this with a loop, but I want to avoid looping!

If the one-hot vectors are only used for retrieving word embeddings, just use an nn.Embedding layer instead.
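For reference, a minimal sketch of that approach (the vocabulary size n and the embedding dimension here are placeholders, not values from the original question):

import torch
import torch.nn as nn

n = 100          # vocabulary size (placeholder)
embed_dim = 64   # embedding dimension (placeholder)

# indices of shape 16 x 28, values in [0, n)
inp = torch.randint(0, n, (16, 28))

# nn.Embedding looks up a vector for each index directly, which is
# equivalent to multiplying one-hot vectors by a weight matrix
embedding = nn.Embedding(n, embed_dim)
out = embedding(inp)   # shape: 16 x 28 x embed_dim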

No, this is not related to word embeddings at all.

You can use Tensor.scatter_ for this purpose.

For example:

import torch

n = 10  # number of classes

# your tensor of 16 x 28 dimensions,
# where each element is some index in [0, n)
inp = torch.randint(0, n, (16, 28))
inp_ = torch.unsqueeze(inp, 2)   # shape: 16 x 28 x 1

# start from all zeros and write a 1 at each index along dim 2
one_hot = torch.zeros(16, 28, n)
one_hot.scatter_(2, inp_, 1)

print(inp)
print(one_hot)
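As a side note (not part of the original answer): on newer PyTorch versions (1.1+) there is also a built-in torch.nn.functional.one_hot that does this in one call:

import torch
import torch.nn.functional as F

n = 10
inp = torch.randint(0, n, (16, 28))

# one_hot adds the class dimension last; the result is a LongTensor,
# so convert to float if you need it for further computation
one_hot = F.one_hot(inp, num_classes=n).float()
print(one_hot.shape)  # torch.Size([16, 28, 10])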