How can I use multi-hot embedding efficiently?

Hello everyone, I encountered a problem in my code.

I want to use the embedding trick to encode my multi-hot vectors, such as [1,1,0,1,0,0,0,0] and [1,0,0,0,0,1,0,0].
However, I found there is no multi-hot embedding layer in PyTorch. I wrote a function to implement multi-hot embedding myself, but it relies on a Python for-loop (roughly like the sketch below), so it is slow and time-consuming.
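My function does something like this (a minimal sketch with placeholder names and sizes, not my exact code):

```python
import torch
import torch.nn as nn

num_items, embed_dim = 8, 4          # placeholder sizes
embedding = nn.Embedding(num_items, embed_dim)

def multi_hot_embed(multi_hot):
    # multi_hot: (batch, num_items) tensor of 0/1
    out = []
    for row in multi_hot:                       # Python-level loop -> slow
        idx = row.nonzero(as_tuple=True)[0]     # indices of the 1 entries
        out.append(embedding(idx).sum(dim=0))   # sum the active embeddings
    return torch.stack(out)                     # (batch, embed_dim)

x = torch.tensor([[1, 1, 0, 1, 0, 0, 0, 0],
                  [1, 0, 0, 0, 0, 1, 0, 0]])
print(multi_hot_embed(x).shape)  # torch.Size([2, 4])
```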
Can you help me solve this problem? Thank you!
