Issue about BPR loss

Hi,

I am working on implementing the Bayesian Personalized Ranking (BPR) loss function and have a few questions:

  1. Is the number of negative items fixed for all users?
  2. Is the number of positive items the same as the number of negative items?
  3. When I call backward on the loss, the loss value is almost 0 (e.g. 5e-7, 6e-8). How should I deal with this?

Here is my code snippet:

import torch
import torch.nn as nn

emb_size = 8
num_pos = 6
num_neg = 3

user_embedding = torch.randn(emb_size)          # one user: [emb_size]
pos_embedding = torch.randn(num_pos, emb_size)  # positive items: [num_pos, emb_size]
neg_embedding = torch.randn(num_neg, emb_size)  # negative items: [num_neg, emb_size]

# Dot product of the user with each item (broadcasts over the item dimension)
pos_score = (user_embedding * pos_embedding).sum(1)  # shape [num_pos]
neg_score = (user_embedding * neg_embedding).sum(1)  # shape [num_neg]

# Sum all scores first, then take a single log-sigmoid
bpr_loss = -nn.LogSigmoid()(pos_score.sum() - neg_score.sum())
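For comparison, here is a pairwise variant I also tried, where each positive score is compared against each negative score via broadcasting instead of summing all scores first (variable names are my own; this is just my understanding of the pairwise formulation, not necessarily the canonical one):

```python
import torch
import torch.nn.functional as F

emb_size, num_pos, num_neg = 8, 6, 3

user_embedding = torch.randn(emb_size, requires_grad=True)
pos_embedding = torch.randn(num_pos, emb_size)
neg_embedding = torch.randn(num_neg, emb_size)

pos_score = (user_embedding * pos_embedding).sum(1)  # [num_pos]
neg_score = (user_embedding * neg_embedding).sum(1)  # [num_neg]

# Broadcast to every (positive, negative) pair: [num_pos, num_neg]
pairwise_diff = pos_score.unsqueeze(1) - neg_score.unsqueeze(0)

# Average of -log sigmoid(pos - neg) over all pairs
bpr_loss = -F.logsigmoid(pairwise_diff).mean()
bpr_loss.backward()
```

With this version the loss stays noticeably above zero for random embeddings, since some pairs always rank the negative above the positive. Is this pairwise form the intended one?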

Thanks in advance for your help.