Backpropagating the loss of specific classes

Hello,

In my model, I want to backpropagate the loss of specific classes only. Say I have 5 classes in total for classification. Once the classifier outputs its values, I want the loss computation to use only 3 selected classes and backpropagate their loss; the others should be masked. Is there a way to achieve this in PyTorch?

Thanks

Hi, I don’t know if this is the best solution, but maybe you can pass a mask as an input to the forward function:

def forward(self, x, mask):
    outputs = self.classifier(x)  # however your model computes its class scores
    # zero out the entries for the classes you want to exclude
    return mask * outputs

And then use the same mask on the targets:

masked_outputs = model(inputs, mask)              # scores with excluded classes zeroed
masked_targets = mask * targets                   # zero the same entries in the targets
loss = criterion(masked_outputs, masked_targets)
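Fleshed out into a self-contained toy example (the model, sizes, and data here are made up for illustration; I'm assuming a multi-label setup with BCEWithLogitsLoss, since multiplying targets by a mask only makes sense with per-class targets, not class indices):

import torch
import torch.nn as nn

class ToyClassifier(nn.Module):  # hypothetical stand-in for your model
    def __init__(self, in_features, num_classes):
        super().__init__()
        self.classifier = nn.Linear(in_features, num_classes)

    def forward(self, x, mask):
        # zero out the scores of the classes we don't want to train on
        return mask * self.classifier(x)

model = ToyClassifier(in_features=10, num_classes=5)
criterion = nn.BCEWithLogitsLoss()

inputs = torch.randn(8, 10)
targets = torch.randint(0, 2, (8, 5)).float()  # multi-label 0/1 targets
mask = torch.tensor([1., 1., 1., 0., 0.])      # keep classes 0-2, mask out 3-4

masked_outputs = model(inputs, mask)
masked_targets = mask * targets
loss = criterion(masked_outputs, masked_targets)
loss.backward()

One caveat: each masked position still adds a constant log 2 to the loss value (BCE of a zero logit against a zero target), but its gradient w.r.t. the model parameters is zero, because the mask zeroes the logit before the loss.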

I haven’t tried this in PyTorch yet, please tell me if it works for you!

Hey, thanks for replying. I thought of doing the same, but I’m not sure how it will behave when the inputs are shuffled. Is there some kind of seed value I can set during training for the dataset enumerator?

I think I misunderstood your question. Did you mean one of these?

  1. Ignore samples in a batch that are not from the selected classes

  2. Use all samples per batch, but backpropagate only through the dimensions of the output vector corresponding to the selected classes (similar to what some people do in deep Q-learning to backpropagate only through a single action); see the sketch below
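
For option 2, here is a minimal sketch of what I mean (the model, data, and selected indices are placeholders): instead of zeroing entries, you can index out the selected columns, so the loss, and therefore the gradient, only touches those dimensions:

import torch
import torch.nn as nn

# placeholder model and data, just to make the sketch runnable
model = nn.Linear(10, 5)
criterion = nn.BCEWithLogitsLoss()
inputs = torch.randn(8, 10)
targets = torch.randint(0, 2, (8, 5)).float()

selected = torch.tensor([0, 1, 2])     # e.g. train on classes 0-2 only
logits = model(inputs)                 # shape (8, 5)
selected_logits = logits[:, selected]  # shape (8, 3); indexing keeps autograd intact
loss = criterion(selected_logits, targets[:, selected])
loss.backward()                        # gradients flow only into the selected columns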

I was thinking about option 2. Btw, I think it’s not very simple to fix the random seed for the DataLoader if you use multiple workers. This thread might have some workarounds.
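
For what it’s worth, the recipe in the PyTorch reproducibility notes for deterministic shuffling with multiple workers is to pass a seeded generator to the DataLoader and re-seed each worker; a sketch with a placeholder dataset:

import random

import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(100, 10))  # placeholder for your dataset

def seed_worker(worker_id):
    # re-seed numpy and random in each worker from the main process seed
    worker_seed = torch.initial_seed() % 2**32
    np.random.seed(worker_seed)
    random.seed(worker_seed)

g = torch.Generator()
g.manual_seed(0)

loader = DataLoader(dataset, batch_size=32, shuffle=True,
                    num_workers=4, worker_init_fn=seed_worker,
                    generator=g)

With the same seed on g, the shuffling order should be reproducible across runs.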