Calculate embedding from smoothed one-hot vector

I have a case where I get a smoothed one-hot (probability) distribution and I would like to do a weighted-average embedding lookup. This is what I am doing now:

    import torch
    import torch.nn as nn

    embedding = nn.Embedding(vocab_size, embdim)
    if flag:
        # input_seqs holds token indices: (batch x max_seq_len)
        embedded = embedding(input_seqs)
    else:
        # input_seqs holds smoothed one-hot rows: (batch x max_seq_len x vocab_size)
        embedded = torch.matmul(input_seqs, embedding.weight.data)

When flag = True I will be passing the normal input sequence as indices of size (batch x max_seq_len), while in the other case it will be smoothed one-hot vectors of size (batch x max_seq_len x vocab_size).

Is this the correct way to do it?

Thanks

This throws an error for embedding.weight.data:

    RuntimeError: mm(): argument 'mat2' (position 1) must be Variable, not torch.cuda.FloatTensor

Any suggestions?

Try using embedding.weight, which is a Variable, instead of embedding.weight.data, which is a plain Tensor.
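
In other words, the matmul branch would become something like this (a minimal sketch, reusing the variable names from the original snippet):

    embedded = torch.matmul(input_seqs, embedding.weight)  # weight is a Parameter/Variable, so autograd can track it

Passing the Parameter itself (rather than its .data) also means gradients can flow into the embedding matrix during training.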

Yes, that worked. But in general, do you think what I am doing is correct?

If you are not using any of the fancier options of Embedding (namely padding_idx, max_norm, norm_type, scale_grad_by_freq), then your code looks plausibly correct to me: a matmul with a one-hot row picks out the corresponding row of the weight matrix, and with a smoothed distribution you get the probability-weighted average of the rows.
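
As a quick check of that intuition, here is a minimal sketch (with made-up toy sizes, on a recent PyTorch where Tensors and Variables are merged) comparing the index lookup against a matmul with hard one-hot vectors:

    import torch
    import torch.nn as nn

    vocab_size, embdim = 10, 4
    embedding = nn.Embedding(vocab_size, embdim)

    indices = torch.randint(0, vocab_size, (2, 3))        # (batch x max_seq_len)
    one_hot = torch.zeros(2, 3, vocab_size)
    one_hot.scatter_(2, indices.unsqueeze(-1), 1.0)       # hard one-hot encoding of the same indices

    lookup = embedding(indices)                           # (2 x 3 x embdim)
    matmul = torch.matmul(one_hot, embedding.weight)      # (2 x 3 x embdim)

    print(torch.allclose(lookup, matmul))                 # True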

That said, I am not particularly familiar with the inner workings of Embeddings, so I won’t give a clear definitive answer on that point.
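
One reason this pattern is useful: unlike the index lookup, the matmul path is differentiable with respect to the smoothed distribution itself. A minimal sketch (again with toy sizes; logits and soft_probs are made-up names, not from the original code):

    import torch
    import torch.nn as nn

    vocab_size, embdim = 10, 4
    embedding = nn.Embedding(vocab_size, embdim)

    logits = torch.randn(2, 3, vocab_size, requires_grad=True)
    soft_probs = torch.softmax(logits, dim=-1)             # smoothed "one-hot" distribution

    embedded = torch.matmul(soft_probs, embedding.weight)  # (2 x 3 x embdim)
    embedded.sum().backward()

    print(logits.grad.shape)                               # gradients reach the distribution
    print(embedding.weight.grad.shape)                     # and the embedding matrix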