Question about tensor dot product multiplication

Good morning,
I would like to represent a document as the sum of its weighted word embeddings.

I use:

  • a batch_size of 128
  • documents of 1000 words each
  • an embedding size of 300
    This gives a 128x1000x300 tensor, which we'll call E.

  • I have 1000 weights per document, corresponding to the words in the first tensor.
    This gives a 128x1000 tensor, which we'll call W.

For every word in the document, I want to multiply its embedding by the corresponding weight. This means scaling all 300 embedding components by the same weight.
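
To make the shapes concrete, here is a naive loop sketch of the operation I mean (a minimal sketch only, assuming PyTorch; the random E and W below are just placeholders):

```python
import torch

# Placeholder tensors with the shapes described above (random values, for illustration only)
E = torch.randn(128, 1000, 300)  # batch x words x embedding dim
W = torch.randn(128, 1000)       # one weight per word, per document

# Loop version of the operation: scale every word embedding by its weight,
# then sum over the 1000 words to get one 300-dim vector per document.
docs = torch.empty(128, 300)
for b in range(128):
    doc_vec = torch.zeros(300)
    for t in range(1000):
        doc_vec += W[b, t] * E[b, t]  # all 300 components scaled by the same weight
    docs[b] = doc_vec

print(docs.shape)  # torch.Size([128, 300])
```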

I would be grateful if you could help me with that.

Thank you,
Ortal

I defined the 128x1000 tensor as 128x1000x1 and that solved the problem: broadcasting then applies each weight across the 300 embedding components.
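
For anyone finding this later, a minimal sketch of that fix, assuming PyTorch (E and W are random placeholders):

```python
import torch

E = torch.randn(128, 1000, 300)  # batch x words x embedding dim
W = torch.randn(128, 1000)       # one weight per word, per document

# Add a trailing singleton dimension: (128, 1000) -> (128, 1000, 1).
# Broadcasting then scales each 300-dim word embedding by its single weight.
weighted = W.unsqueeze(-1) * E   # shape: (128, 1000, 300)

# Sum over the word dimension to get one vector per document.
docs = weighted.sum(dim=1)       # shape: (128, 300)

# Equivalent one-liner with einsum (weight-and-sum in a single call).
docs_einsum = torch.einsum('bw,bwd->bd', W, E)
print(torch.allclose(docs, docs_einsum))  # should print True (up to float tolerance)
```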