I am assuming that you want your result tensor to have the
same shape as your input tensor, [batch_size, 4], with
the largest value in each row replaced by 1 and the lesser
values replaced by 0.
How about (1.e6 * t).softmax(1)?
(No, no, no … Bad idea! Don’t listen to me!!!)
I don’t know of any good, one-step way of doing this. You have
to call argmax() (or something equivalent) and then “one-hot”
the result. The commonly-suggested way to “one-hot” (which, as
far as I know, is the best way) is to use scatter().
t = torch.randn(8, 4)
a = t.argmax(1)  # index of the row-wise maximum
m = torch.zeros(t.shape).scatter(1, a.unsqueeze(1), 1.0)
print('\n', t, '\n\n', a, '\n\n', m)
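As an aside, a sketch of an equivalent using F.one_hot (assuming you want a float mask in the same dtype as t, since one_hot returns int64):

```python
import torch
import torch.nn.functional as F

t = torch.randn(8, 4)

# one_hot() builds the same 0/1 mask from the argmax indices;
# cast back to t's dtype if you need a float mask.
m = F.one_hot(t.argmax(1), num_classes=t.shape[1]).to(t.dtype)
```

This gives the same mask as the scatter() version for the 2-D case.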
I’m not sure whether the F.one_hot answer extends to the n-dimensional case; I’m also not sure how to do it with reshape operations. @ptrblck
Assume I have a tensor where the first two dims are batch and channel, and the last three correspond to xyz space:
A = torch.randn(b, c, 32, 32, 32)
What I would like to do is binarize along the x dimension (dim=2) for every batch and channel, i.e., for every yz location I want to set the maximum value along the x-axis to 1 and the rest to zero. Is there a way of doing this?
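The scatter() approach above does generalize to this case: take argmax along dim=2 with keepdim=True so the index tensor has the same rank as A, then scatter into a zero tensor along that dim. A sketch (b and c are hypothetical sizes for illustration):

```python
import torch

b, c = 2, 3  # hypothetical batch and channel sizes
A = torch.randn(b, c, 32, 32, 32)

# Index of the maximum along the x dimension; keepdim=True keeps
# the tensor 5-D so it matches A's rank, as scatter() requires.
idx = A.argmax(dim=2, keepdim=True)

# Write 1.0 at the argmax position along dim=2, zeros elsewhere.
M = torch.zeros_like(A).scatter(2, idx, 1.0)
```

Every (batch, channel, y, z) slice of M then contains exactly one 1 along x.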