Indexing a 2D tensor with a smaller 2D tensor

Hi folks, I’d like some help on indexing a 2D tensor. Suppose we have the following tensors

d = torch.tensor([[118, 175, 1], [118, 188, 0], [ 66, 201, 1], [ 94, 204, 1], [ 94, 206, 0]])
e = torch.tensor([[66, 201, 0.1], [94, 206, 0.2], [1, 23, 0.6], [118, 188, 0.3], [2, 3, 0.1], [3, 1, 0.2], [94, 204, 0.8], [118, 175, 0.7]])

Tensor e is a set of predictions from a model, where the first 2 columns act as a sort of ID, and tensor d is a set of labels whose first 2 columns are the same kind of ID. I have fewer labels than predictions here, and I’d like to calculate the loss only for the real labels. Hence, I need to index tensor e by tensor d's first 2 columns. Is there a way to do this? Or would using a 3D tensor be better?

Can you clarify “I’d like to only calculate the loss for the real labels” with an example?

Sure. So we have the following tensors

d = torch.tensor([[118, 175, 1], 
                  [118, 188, 0], 
                  [ 66, 201, 1], 
                  [ 94, 204, 1], 
                  [ 94, 206, 0]]) 

e = torch.tensor([[66, 201, 0.1], 
                  [94, 206, 0.2], 
                  [1, 23, 0.6], 
                  [118, 188, 0.3], 
                  [2, 3, 0.1], 
                  [3, 1, 0.2], 
                  [94, 204, 0.8], 
                  [118, 175, 0.7]])

For tensor d, the first 2 columns are the indices, and the third column is the true label. Hence, I’d like to index tensor e and get a prediction tensor like

prediction = torch.tensor([[118, 175, 0.7], 
                           [118, 188, 0.3], 
                           [ 66, 201, 0.1], 
                           [ 94, 204, 0.8], 
                           [ 94, 206, 0.2]])

I can then calculate the BCE loss by taking

bce = nn.BCEWithLogitsLoss(pos_weight=pos_weight)
# BCEWithLogitsLoss expects float targets, and d[:, 2] is a long tensor here
loss = bce(prediction[:, 2], d[:, 2].float())
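One way to build that prediction tensor, as a minimal sketch (assuming every ID pair in d appears exactly once in e): broadcast-compare the first two columns of d against those of e to get a boolean match matrix, then take argmax along the e-axis to recover, for each row of d, its row index into e.

```python
import torch

d = torch.tensor([[118, 175, 1],
                  [118, 188, 0],
                  [ 66, 201, 1],
                  [ 94, 204, 1],
                  [ 94, 206, 0]])

e = torch.tensor([[118, 175, 0.7],
                  [118, 188, 0.3],
                  [1, 23, 0.6],
                  [94, 206, 0.2],
                  [2, 3, 0.1],
                  [3, 1, 0.2],
                  [94, 204, 0.8],
                  [66, 201, 0.1]])

# Compare every ID pair in d against every ID pair in e:
# (5, 1, 2) == (1, 8, 2) broadcasts to (5, 8, 2); all(dim=2) -> (5, 8) boolean.
match = (d[:, None, :2] == e[None, :, :2]).all(dim=2)

# For each row of d, the column index of its (assumed unique) match in e.
idx = match.int().argmax(dim=1)

prediction = e[idx]  # rows of e reordered to line up with the rows of d
```

Note that the comparison promotes d's integer IDs to e's float dtype; that is safe here because these integers are exactly representable as floats. If a row of d might have no match in e, argmax would silently return 0, so you may want to guard with `assert match.any(dim=1).all()` first.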