I am looking for an efficient way to manually edit embedding weights between forward/backprop iterations.
Say I have three embeddings, for example:
self.tok_embA = nn.Embedding(config.vocab_size, config.n_embd)
self.tok_embB = nn.Embedding(config.vocab_size, config.n_embd)
self.tok_embC = nn.Embedding(config.vocab_size, config.n_embd)
Then I train tok_embA as usual in the model, and perhaps tok_embB too, using an idx tensor of token indices.
But then I want to perform arbitrary manual arithmetic on tok_embC, using the same indices and the weights from the trained tok_embA and tok_embB. Essentially a loop over the rows selected by the current input prompt. The arithmetic involves lots of conditional logic per row, so a simple whole-tensor operation like self.tok_embA.weight.sub(self.tok_embB.weight) won't help.
What is the preferred way to loop through the embedding weights selected by the current idx and perform manual arithmetic on them?
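To illustrate what I mean, here is a minimal sketch of the kind of per-row conditional update I have in mind. The vocab_size, n_embd, idx values and the norm-based condition are stand-ins for my actual setup; the key parts are indexing the weight rows with idx and wrapping the in-place edits in torch.no_grad() so autograd doesn't track them:

```python
import torch
import torch.nn as nn

vocab_size, n_embd = 100, 16
tok_embA = nn.Embedding(vocab_size, n_embd)
tok_embB = nn.Embedding(vocab_size, n_embd)
tok_embC = nn.Embedding(vocab_size, n_embd)

idx = torch.tensor([3, 7, 42])  # token indices from the current prompt

with torch.no_grad():
    # Advanced indexing copies the selected rows: shape (len(idx), n_embd)
    a = tok_embA.weight[idx]
    b = tok_embB.weight[idx]
    for i, t in enumerate(idx.tolist()):
        # placeholder for arbitrary conditional logic per token
        if a[i].norm() > b[i].norm():
            tok_embC.weight[t] = a[i] - b[i]
        else:
            tok_embC.weight[t] = a[i] + b[i]
```

If the conditions can be expressed elementwise, the loop could presumably be vectorized with something like torch.where and tok_embC.weight.index_copy_, but the explicit loop is what I'm asking about.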