How to make a BatchNorm layer NOT learn from inputs with a specified label?

Assume I have a mini-batch of inputs with labels in {1, 2, …, N}. I need the Label-N inputs to have no impact on the update of the BN layer's parameters. Is there an easy way to reach this goal? Thanks a lot.

Btw, I tried an approach which I eventually concluded was wrong. I pick out the Label-N inputs, set the model to evaluation mode with “model.eval()”, and let the Label-N inputs go forward through the model; then I set the model back to training mode and feed in the rest of the data to generate the loss. But in the following backpropagation stage, in which the parameters get updated, I guess the BN layer still learns from all the inputs. Is my guess correct?
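
In code, the two-pass approach looks roughly like this (`model`, `criterion`, and `optimizer` are placeholders for my setup, not anything special):

```python
import torch

def train_step(model, criterion, optimizer, inputs, labels, N):
    exclude = labels == N

    # Pass 1: Label-N inputs in eval mode. BN normalizes with its
    # running statistics and does not update them from these inputs.
    model.eval()
    with torch.no_grad():  # their outputs never feed the loss anyway
        _ = model(inputs[exclude])

    # Pass 2: the remaining inputs in train mode. Only these form the
    # batch statistics and the loss.
    model.train()
    outputs = model(inputs[~exclude])
    loss = criterion(outputs, labels[~exclude])

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss
```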

If you sent the Label-N inputs through while the model was in eval mode, the BN layer does not update its running statistics from them; and since their outputs never enter the loss, they contribute no gradients during backprop, so the BN layer's learnable parameters won't learn from those inputs either.
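
If you want to verify the eval-mode behaviour yourself, here is a quick standalone check (an illustrative experiment, not from your training code):

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(4)
x = torch.randn(8, 4)

bn.eval()
before = bn.running_mean.clone()
_ = bn(x)
assert torch.equal(bn.running_mean, before)      # unchanged in eval mode

bn.train()
_ = bn(x)
assert not torch.equal(bn.running_mean, before)  # updated in train mode
```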