About the weight parameter in BCEWithLogitsLoss for handling imbalanced samples

I read the documentation of BCEWithLogitsLoss and found that its "weight" parameter is different from CrossEntropyLoss's "weight". In BCEWithLogitsLoss, "weight" has to be a Tensor of size nbatch, but in CrossEntropyLoss, "weight" has shape C (the number of classes). At first I did not understand what "nbatch" meant: the number of batches or the batch size. After some experimenting, I found it should be the batch size. I then passed a Tensor of size N*C (batch size × number of classes, filled with per-class weights) as "weight" to BCEWithLogitsLoss, and the loss function runs without any problems.
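
For reference, here is a minimal sketch of what I tried (the class weights 1.0/2.0/0.5 and the sizes are made up for illustration):

```python
import torch
import torch.nn as nn

N, C = 4, 3  # hypothetical batch size and number of classes

# Per-class weights, expanded to shape (N, C) so every sample gets the
# same weight for a given class. This (N, C) tensor matches the shape
# of the element-wise BCE loss, so the multiplication goes through.
class_weights = torch.tensor([1.0, 2.0, 0.5])
weight = class_weights.expand(N, C)

criterion = nn.BCEWithLogitsLoss(weight=weight)

logits = torch.randn(N, C)                      # raw model outputs
targets = torch.randint(0, 2, (N, C)).float()   # multi-label targets
loss = criterion(logits, targets)
print(loss)
```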

But I am still worried about whether this is correct. So my question is: does the "weight" parameter in BCEWithLogitsLoss have the same effect (balancing imbalanced samples) as the one in CrossEntropyLoss? If so, why are they defined differently? I think this may confuse other users like me.
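
For comparison, this is how I understand the CrossEntropyLoss version, where "weight" is one value per class with shape C (same made-up weights as above):

```python
import torch
import torch.nn as nn

C = 3  # hypothetical number of classes

# One weight per class; each sample's loss is scaled by the weight
# of its target class.
class_weights = torch.tensor([1.0, 2.0, 0.5])
criterion = nn.CrossEntropyLoss(weight=class_weights)

logits = torch.randn(4, C)             # (batch, classes)
targets = torch.randint(0, C, (4,))    # one class index per sample
loss = criterion(logits, targets)
print(loss)
```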

I would really appreciate it if someone could help me. Thanks!