BCEWithLogitsLoss inherits _Loss and not _WeightedLoss

Hey, while working on my own loss function I noticed that
CrossEntropyLoss and BCELoss inherit _WeightedLoss (which itself inherits _Loss),
while BCEWithLogitsLoss inherits _Loss directly.
The only difference between _WeightedLoss and _Loss is the weight parameter, which _WeightedLoss registers as a buffer on self.
In BCEWithLogitsLoss's initialization, we initialize a weight parameter and a pos_weight parameter and register them in exactly the same way as _WeightedLoss does.

Is there a reason for this? Should BCEWithLogitsLoss also inherit _WeightedLoss?

I am using torch 2.0.1
Thanks!