List of Embedding objects for Transformer

I have a model where each input from a list of inputs gets processed by its own Embedding and the results are concatenated. The underlying Embedding objects don’t have to be consistent with each other in any way.
I subclass my model from Transformer and store the Embeddings in a plain list. But the embeddings in this list do not show up in transformer.named_parameters(), they are not transferred to CUDA when I transfer the transformer object to CUDA, and in effect the list is ignored entirely. Is there an alternative way to store these Embedding objects so that the Transformer treats them correctly?

If I only had a couple of them, I could make each one a separate member of my subclass, but in my case I need to store them in something like a list.
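For concreteness, here is a minimal sketch of the pattern (the class name MultiEmbeddingTransformer and the constructor arguments are just illustrative, and it subclasses nn.Module for brevity; the same applies when subclassing nn.Transformer):

```python
import torch
import torch.nn as nn

class MultiEmbeddingTransformer(nn.Module):
    def __init__(self, vocab_sizes, dims):
        super().__init__()
        # Plain Python list: these Embeddings are NOT registered as
        # submodules, so named_parameters() and .to(device) never see them.
        self.embeddings = [nn.Embedding(v, d) for v, d in zip(vocab_sizes, dims)]

    def forward(self, inputs):
        # Embed each input with its own Embedding, then concatenate.
        return torch.cat(
            [emb(x) for emb, x in zip(self.embeddings, inputs)], dim=-1
        )
```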

I guess you might have been using plain Python lists or dicts to store the embedding layers.
If that's the case, use nn.ModuleList / nn.ModuleDict instead, which will make sure these modules are properly registered and pushed to the desired device via the to() operation on the parent model.
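For example, swapping the plain list in the sketch above for an nn.ModuleList (same illustrative names) should be all you need:

```python
import torch
import torch.nn as nn

class MultiEmbeddingTransformer(nn.Module):
    def __init__(self, vocab_sizes, dims):
        super().__init__()
        # nn.ModuleList registers each Embedding as a submodule, so their
        # parameters show up in named_parameters() and follow .to(device).
        self.embeddings = nn.ModuleList(
            nn.Embedding(v, d) for v, d in zip(vocab_sizes, dims)
        )

    def forward(self, inputs):
        return torch.cat(
            [emb(x) for emb, x in zip(self.embeddings, inputs)], dim=-1
        )

device = "cuda" if torch.cuda.is_available() else "cpu"
model = MultiEmbeddingTransformer([100, 50], [16, 8]).to(device)
print([name for name, _ in model.named_parameters()])
# ['embeddings.0.weight', 'embeddings.1.weight']
```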

Thank you for the timely response. nn.ModuleList works perfectly here. If you have come across examples of NLP models with concatenated embeddings, please let me know.