Binary cross entropy with logits with weights

Hi!

I am currently working with the function torch.nn.functional.binary_cross_entropy_with_logits (see the PyTorch 2.2 documentation) and I have some questions. I am not sure that I have correctly grasped the difference between pos_weight and weight.
pos_weight : used to give a bigger weight to the positive class than to the negative class
weight : used to give a different weight to each element of the batch?
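
To make sure I am understanding the call signature, here is how I am currently passing the two arguments (all numbers and shapes below are made up by me, just for illustration):

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    logits = torch.randn(4)                      # made-up raw model outputs (no sigmoid)
    y_true = torch.tensor([1., 0., 1., 0.])      # made-up binary targets

    # pos_weight: a single value that (I think) multiplies only the positive-class term
    loss_pos = F.binary_cross_entropy_with_logits(
        logits, y_true, pos_weight=torch.tensor([2.0]))

    # weight: my understanding is one value per element of the batch,
    # rescaling that element's whole loss (is that right?)
    loss_w = F.binary_cross_entropy_with_logits(
        logits, y_true, weight=torch.tensor([1.0, 0.5, 1.0, 0.5]))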

  1. When using pos_weight, what is the weight given to the negative class? (Is it 1?)
  2. Is it possible to change the weight of the negative class using torch.nn.functional.binary_cross_entropy_with_logits?
  3. Is this the right way to use it in code? (A runnable version is sketched after this list.)
    (N_0 = number of instances in class 0, N_1 = number of instances in class 1)
    weight_1 = torch.tensor([N_0 / N_1])
    loss = F.binary_cross_entropy_with_logits(logits, y_true, pos_weight=weight_1)
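
For completeness, here is a runnable version of point 3, with the per-element formula I assume pos_weight corresponds to written out as a check (the commented formula is my assumption, which is exactly what I would like to have confirmed):

    import torch
    import torch.nn.functional as F

    torch.manual_seed(0)
    N_0, N_1 = 900, 100                          # made-up class counts (class 0 / class 1)
    logits = torch.randn(8)                      # raw model outputs, no sigmoid applied
    y_true = torch.randint(0, 2, (8,)).float()   # made-up binary targets

    weight_1 = torch.tensor([N_0 / N_1])
    loss = F.binary_cross_entropy_with_logits(logits, y_true, pos_weight=weight_1)

    # What I assume happens per element (with the negative class implicitly weighted by 1?):
    # loss_n = -( pos_weight * y_n * log(sigmoid(x_n)) + (1 - y_n) * log(1 - sigmoid(x_n)) )
    p = torch.sigmoid(logits)
    manual = -(weight_1 * y_true * p.log() + (1 - y_true) * (1 - p).log()).mean()
    print(loss.item(), manual.item())            # if these agree, my understanding is correct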

Thank you all for your help! :slight_smile: