Different dimensions when using CosineEmbeddingLoss

Hi guys,

Is it possible to use inputs with different dimensions when using CosineEmbeddingLoss?

I mean the sizes of x1 and x2 are different. For example, x1 is 10x3, where 10 is the batch size and 3 is the embedding dimension, and x2 is 10x5x3, where 5 is the negative sampling size.

Thanks

You may wanna try something like broadcast

TRY:

cosine_embedding_loss(x1.view(10, 1, 3), x2, target)
# x1 would broadcast to (10, 5, 3) to match x2
# (target is the usual tensor of +1/-1 labels that the loss also requires)
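Here is a minimal runnable sketch of the broadcast idea that stays within the documented 2D `(N, D)` input shape of `F.cosine_embedding_loss`: expand x1 along a new negative-sampling axis, flatten both tensors, and supply a matching label tensor. The `target` of all -1s below is an assumption for illustration, marking every pair as a negative pair; the sizes are the ones from the question.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
batch, neg, dim = 10, 5, 3           # sizes from the question
x1 = torch.randn(batch, dim)         # anchors: (10, 3)
x2 = torch.randn(batch, neg, dim)    # negatives: (10, 5, 3)

# Broadcast x1 against x2's negative-sampling axis, then flatten
# both tensors to the 2D (N, D) shape cosine_embedding_loss expects.
a = x1.unsqueeze(1).expand(-1, neg, -1).reshape(-1, dim)  # (50, 3)
b = x2.reshape(-1, dim)                                   # (50, 3)

# -1 labels: treat every (anchor, sample) pair as a negative pair.
target = -torch.ones(batch * neg)

loss = F.cosine_embedding_loss(a, b, target)
print(loss.item())
```

With target = -1 and the default margin of 0, each pair contributes max(0, cos(a, b)), so the averaged loss is always non-negative.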

BTW, it seems something is wrong with your formatting:
use 10x3 instead of 10*3


Thank you so much! I will try your suggestion, and I've already fixed the * into x.