Is there a fast way to do multi-embedding?

I have 100 categorical features, each with a different value range, and I want to map each of them to a 16-dimensional vector:

import torch
import torch.nn as nn

# one table per feature; feature_ranges[i] = number of distinct values of feature i
embedding_layer_list = nn.ModuleList()
for i in range(100):
  embedding_layer_list.append(nn.Embedding(feature_ranges[i], 16))

# forward: x is a LongTensor of shape (batch, 100)
out = torch.cat([emb(x[:, i]) for i, emb in enumerate(embedding_layer_list)], dim=1)

Is there any way to do this multi-embedding without the for-loop?

Given the nature of embedding layers, why not just use:

embedding_layer = nn.Embedding(100 * feature_range, 16)

At the end of the day, you are looking up either 100 smaller tables or one bigger table; the resulting vectors are the same either way.

Then just shift feature i's tokens by an offset of i * feature_range, so that every index lands in the range 0 to 100 * feature_range - 1 and each feature uses its own slice of the table. Since your value ranges differ, take feature_range to be the largest of them; the unused rows simply stay untrained.
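A minimal sketch of this single-table approach. The sizes here (feature_range = 50, batch of 4) are illustrative assumptions, not values from your setup; the point is that adding a per-feature offset turns the whole thing into one vectorized lookup with no Python loop:

```python
import torch
import torch.nn as nn

num_features = 100   # number of categorical features
feature_range = 50   # assumed largest per-feature range (illustrative)
dim = 16

# one big table holding all features' rows back to back
embedding_layer = nn.Embedding(num_features * feature_range, dim)

# dummy batch: each feature's raw token is in [0, feature_range)
x = torch.randint(0, feature_range, (4, num_features))

# shift feature i's tokens into slice [i*feature_range, (i+1)*feature_range)
offsets = torch.arange(num_features) * feature_range  # shape (100,), broadcasts over the batch

out = embedding_layer(x + offsets)  # shape (4, 100, 16)
print(out.shape)
```

If the per-feature ranges differ a lot, you can replace the uniform offsets with a cumulative sum of the actual ranges (`torch.cumsum`) to avoid wasting rows, at the cost of slightly more bookkeeping.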
