How to make different EmbeddingBags share the same embedding matrix?

I would like to apply both the "mean" and the "sum" reduction to the same BoW feature. The docs show that an EmbeddingBag can be initialized with given weights via the from_pretrained method, but it seems the underlying THPVariable_make_subclass function wraps those weights in a new tensor each time, so each module registers its own weight tensor in the computational graph.
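
For illustration, a minimal sketch of the behaviour (the sizes and modes here are made up): each from_pretrained call wraps the weights in its own Parameter, so the two bags are separate leaves in the autograd graph and receive separate gradients.

    import torch
    import torch.nn as nn

    weight = torch.randn(10, 4)
    bag_mean = nn.EmbeddingBag.from_pretrained(weight, mode="mean", freeze=False)
    bag_sum = nn.EmbeddingBag.from_pretrained(weight, mode="sum", freeze=False)

    # two distinct Parameter objects, i.e. two separate autograd leaves
    print(bag_mean.weight is bag_sum.weight)  # False

    # a backward pass through one bag leaves the other's .grad untouched
    bag_mean(torch.tensor([0, 1, 2]), torch.tensor([0])).sum().backward()
    print(bag_mean.weight.grad is not None)   # True
    print(bag_sum.weight.grad)                # None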

The most straightforward way to share weights might be to use the functional interface for the "second" invocation: take the call to F.embedding_bag from EmbeddingBag.forward and substitute the shared weight (and the other configuration) for the attributes read from self:

    def forward(self, input, offsets=None, per_sample_weights=None):
        # type: (Tensor, Optional[Tensor], Optional[Tensor]) -> Tensor
        # everything except the three arguments above comes from self;
        # substitute a shared weight (and settings) here to share parameters
        return F.embedding_bag(input, self.weight, offsets,
                               self.max_norm, self.norm_type,
                               self.scale_grad_by_freq, self.mode, self.sparse,
                               per_sample_weights, self.include_last_offset)
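
For example, here is a minimal sketch (the module and parameter names are my own) of one weight matrix feeding two F.embedding_bag calls, so that both reductions read, and backpropagate into, the same Parameter:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SharedEmbeddingBags(nn.Module):
        def __init__(self, num_embeddings, embedding_dim):
            super().__init__()
            # one weight matrix, registered once and used by both reductions
            self.weight = nn.Parameter(torch.randn(num_embeddings, embedding_dim))

        def forward(self, input, offsets=None):
            summed = F.embedding_bag(input, self.weight, offsets, mode="sum")
            meaned = F.embedding_bag(input, self.weight, offsets, mode="mean")
            return summed, meaned

    bags = SharedEmbeddingBags(10, 4)
    input = torch.tensor([0, 1, 2, 3, 4])
    offsets = torch.tensor([0, 2])          # two bags: [0, 1] and [2, 3, 4]
    summed, meaned = bags(input, offsets)   # each of shape (2, 4)

Since there is only one Parameter, gradients from both reductions accumulate into it and the optimizer updates it once.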

That said, for computing both reductions over the same features, I would imagine it is more efficient to do the lookup only once: either use Embedding and take the sum/mean manually, or just run EmbeddingBag with mode="sum" and rescale the result yourself to get the mean (see the sketch below).
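
For example, a sketch of the sum-plus-rescale variant (assuming the usual 1D input with offsets; the empty-bag clamp is my own guard):

    import torch
    import torch.nn as nn

    bag = nn.EmbeddingBag(10, 4, mode="sum")
    input = torch.tensor([0, 1, 2, 3, 4])
    offsets = torch.tensor([0, 2])          # two bags: [0, 1] and [2, 3, 4]

    summed = bag(input, offsets)            # (num_bags, embedding_dim)

    # bag sizes from consecutive offsets; the last bag runs to the end of input
    sizes = torch.cat([offsets[1:], torch.tensor([input.numel()])]) - offsets
    meaned = summed / sizes.unsqueeze(1).clamp(min=1)  # clamp guards empty bags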

Best regards

Thomas