torch.nn.Embedding() with variable num_embeddings in a single NN model

Hi There,

In torch.nn.Embedding(num_embeddings, embedding_dim, …), the depth of the table (i.e. the num_embeddings value) must be fixed to a constant for a given NN model. What if, in this model, num_embeddings itself is a variable? How can I cope with that?

Any insight in this regard will be highly appreciated.


In that case it might be simpler to create the weight parameter yourself and use the functional API (torch.nn.functional.embedding). Once you want to grow the table, you could e.g. create a new, larger weight parameter (copying the old rows into it) and pass it to a new optimizer so that it can be trained again.
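A minimal sketch of this approach (the table sizes, learning rate, and optimizer choice below are arbitrary placeholders):

```python
import torch
import torch.nn.functional as F

embedding_dim = 4

# Start with an embedding table of 10 rows, managed as a plain Parameter.
weight = torch.nn.Parameter(torch.randn(10, embedding_dim))
optimizer = torch.optim.SGD([weight], lr=0.1)

# Look up embeddings via the functional API instead of nn.Embedding.
idx = torch.tensor([0, 3, 9])
out = F.embedding(idx, weight)
print(out.shape)  # torch.Size([3, 4])

# Later: grow the table to 15 rows, keeping the already-trained rows.
with torch.no_grad():
    new_rows = torch.randn(5, embedding_dim)
    weight = torch.nn.Parameter(torch.cat([weight, new_rows], dim=0))

# Re-create the optimizer so the new parameter is the one being trained.
optimizer = torch.optim.SGD([weight], lr=0.1)

out = F.embedding(torch.tensor([12]), weight)
print(out.shape)  # torch.Size([1, 4])
```

Lookups against indices 0–9 still use the previously trained rows, while indices 10–14 start from fresh random values.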

Thank you very much.