Loss function which acts on only one class

Hi all, I have a network which first classifies samples. After this, the classifier outputs are fed to a regressor, which is used to decorrelate the classifier output from a specific feature. However, I only want to apply the regression loss to the background class (the signal class should not be decorrelated). Is there a way to define a loss function that behaves normally for one class and is constant/zero for the other?

I tried something like this, since I have a batch size of 50:

```python
regression_loss_function(reg_outputs, reg_targets) * (torch.ones([50]) - classifier_targets) \
    + classifier_loss_function(classifier_outputs, classifier_targets)
```

But I realise I understand too little of the PyTorch loss functions to make this work. Any help is appreciated!
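To make the idea above concrete, here is a minimal sketch of what I'm aiming for. The loss functions and tensor shapes are placeholders (I'm assuming MSE for the regression, BCE for the classification, and binary targets where 1 = signal and 0 = background); the key point is that `reduction='none'` keeps one loss value per sample, so the signal samples can be masked out before averaging:

```python
import torch
import torch.nn as nn

# Placeholder batch: 50 samples, binary targets (1 = signal, 0 = background)
batch_size = 50
reg_outputs = torch.randn(batch_size)
reg_targets = torch.randn(batch_size)
classifier_outputs = torch.randn(batch_size)  # raw logits
classifier_targets = torch.randint(0, 2, (batch_size,)).float()

# reduction='none' returns a per-sample loss vector instead of a scalar,
# so the mask can zero out the signal class before we reduce.
reg_loss_fn = nn.MSELoss(reduction='none')
cls_loss_fn = nn.BCEWithLogitsLoss()

mask = 1.0 - classifier_targets  # 1 for background, 0 for signal

# Average the regression loss over background samples only
# (clamp avoids division by zero if a batch contains no background).
reg_loss = (reg_loss_fn(reg_outputs, reg_targets) * mask).sum() / mask.sum().clamp(min=1.0)

total_loss = reg_loss + cls_loss_fn(classifier_outputs, classifier_targets)
```

Is this roughly the right approach, or is there a more idiomatic way to restrict a loss to one class?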