Manual arithmetic on embedding weights

I am looking for an efficient way to manually edit embedding weights between forward/backward iterations.

Say I have three embeddings, for example:

self.tok_embA = nn.Embedding(config.vocab_size, config.n_embd)
self.tok_embB = nn.Embedding(config.vocab_size, config.n_embd)
self.tok_embC = nn.Embedding(config.vocab_size, config.n_embd)

Then I train tok_embA as usual in the model, and perhaps tok_embB too, using an idx tensor of token indices.

But then I want to perform arbitrary manual arithmetic on tok_embC, using the same token indices and the weights from the trained tok_embA and tok_embB. Think of it as a simple loop over the current tensors for the current input prompt; but the arithmetic involves lots of conditional logic, so a simple self.tok_embA.sub(self.tok_embB) etc. over all tensors won’t help.
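A minimal standalone sketch of what I mean (the sizes, the idx values, and the norm-based condition are made up; the real conditions are more involved):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, n_embd = 10, 4  # placeholder sizes, not my real config
tok_embA = nn.Embedding(vocab_size, n_embd)
tok_embB = nn.Embedding(vocab_size, n_embd)
tok_embC = nn.Embedding(vocab_size, n_embd)

idx = torch.tensor([1, 3, 7])  # token indices from the current prompt

# Manual edits between iterations should not be tracked by autograd
with torch.no_grad():
    for i in idx:
        a = tok_embA.weight[i]
        b = tok_embB.weight[i]
        # placeholder for the arbitrary conditional arithmetic
        if a.norm() > b.norm():
            tok_embC.weight[i] = a - b
        else:
            tok_embC.weight[i] = a + b
```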

What is a preferable way to loop through the currently indexed embedding weights and perform manual arithmetic on them?

The easiest way would be to apply the loop and manipulate the parameter using all your conditions.
It’s hard to tell if and how these operations could avoid the loop without seeing the conditions, but you might be able to use e.g. a mask.
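If the conditions can be expressed per-row, a boolean mask plus torch.where replaces the Python loop. A sketch under the same made-up setup as above (the norm comparison stands in for your real conditions):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
vocab_size, n_embd = 10, 4  # placeholder sizes
tok_embA = nn.Embedding(vocab_size, n_embd)
tok_embB = nn.Embedding(vocab_size, n_embd)
tok_embC = nn.Embedding(vocab_size, n_embd)

idx = torch.tensor([1, 3, 7])

with torch.no_grad():
    a = tok_embA.weight[idx]  # shape (len(idx), n_embd)
    b = tok_embB.weight[idx]
    # boolean mask encoding the per-row condition (placeholder condition)
    mask = (a.norm(dim=1) > b.norm(dim=1)).unsqueeze(1)
    # select a - b where the mask is True, a + b elsewhere, in one shot
    tok_embC.weight[idx] = torch.where(mask, a - b, a + b)
```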


Thanks! I actually had the mask option in mind, and now I am more confident that it’s the way forward. Now I need to somehow convert my logic into masked matrix operations … but that’s another story :smiley: