DataLoader for multi-label data in PyTorch

Hi,
My data is multi-label, with 1 to 4 labels per image.
I have been using one-hot encoding of the labels to build the DataLoader.

I would now like to use a few customised loss functions, such as the following:
import math

import torch
import torch.nn as nn


class ArcFaceLoss(nn.modules.Module):
    def __init__(self, s=30.0, m=0.5):
        super().__init__()
        self.crit = nn.CrossEntropyLoss()  # other options: DenseCrossEntropy(), nn.BCEWithLogitsLoss()
        self.s = s
        self.cos_m = math.cos(m)
        self.sin_m = math.sin(m)
        self.th = math.cos(math.pi - m)
        self.mm = math.sin(math.pi - m) * m

    def forward(self, logits, labels):
        # logits are expected to be cosine similarities in [-1, 1]
        logits = logits.float()
        cosine = logits
        sine = torch.sqrt(1.0 - torch.pow(cosine, 2))
        phi = cosine * self.cos_m - sine * self.sin_m
        phi = torch.where(cosine > self.th, phi, cosine - self.mm)

        # labels act as a one-hot mask: the margin is applied only to the target classes
        output = (labels * phi) + ((1.0 - labels) * cosine)
        output *= self.s
        loss = self.crit(output, labels)
        return loss / 2
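For context, a hypothetical usage sketch; the batch size, class count, and the assumption that the incoming logits are already cosine similarities are mine, not from the original post:

import torch

# hypothetical sizes, chosen only for illustration
batch_size, nb_classes = 8, 10

criterion = ArcFaceLoss(s=30.0, m=0.5)

# ArcFace assumes the incoming logits are cosine similarities in [-1, 1],
# e.g. from a head that L2-normalises both the features and the class weights
cosine_logits = torch.randn(batch_size, nb_classes).clamp(-1.0, 1.0)

# one-hot float targets, matching the (labels * phi) masking in forward()
targets = torch.zeros(batch_size, nb_classes)
targets[torch.arange(batch_size), torch.randint(0, nb_classes, (batch_size,))] = 1.0

loss = criterion(cosine_logits, targets)
# note: whether nn.CrossEntropyLoss accepts float one-hot targets depends on the
# PyTorch version (1.10+ interprets float targets as class probabilities)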

It seems this loss function doesn’t take one-hot encoded labels, but if I drop the encoding, I get the following error:
‘RuntimeError: stack expects each tensor to be equal size, but got [1] at entry 0 and [3] at entry 1’

May I know if there is another way to obtain a DataLoader for multi-label data? Thank you.

nn.CrossEntropyLoss expects class indices in the range [0, nb_classes-1] instead of one-hot encoded tensors, so loss = self.crit(output, labels) would raise an error.
However, I guess the posted error message is raised even before reaching this line of code, so could you check which line of code raises it and then the shapes of all tensors used in that operation?
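For reference, a minimal sketch of the target format nn.CrossEntropyLoss expects; the batch size and class count are made up for illustration:

import torch
import torch.nn as nn

criterion = nn.CrossEntropyLoss()

logits = torch.randn(4, 10)           # [batch_size, nb_classes]
targets = torch.tensor([3, 0, 9, 1])  # class indices in [0, nb_classes - 1]

loss = criterion(logits, targets)     # works with index targets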

Thanks for the inputs.
This error appears when the __getitem__(self, index) method is executed for the DataLoader.
It happens because images in the dataset have different numbers of labels, so the label tensors have different lengths. I am looking for alternative ways to encode the labels so that I can use loss functions other than nn.BCEWithLogitsLoss() (one alternative is sketched below).
Thanks
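One alternative, sketched under the assumption that each sample returns a variable-length tensor of label indices (the padding value -1 and the dataset name are my placeholders, not from the thread), is to keep the index labels and pad them in a custom collate_fn so the default stacking no longer fails:

import torch
from torch.nn.utils.rnn import pad_sequence
from torch.utils.data import DataLoader

def pad_collate(batch):
    # batch is a list of (image, label_indices) pairs, where label_indices
    # is a 1D LongTensor whose length differs per sample
    images = torch.stack([image for image, _ in batch])
    labels = pad_sequence([label for _, label in batch],
                          batch_first=True, padding_value=-1)
    return images, labels

# loader = DataLoader(my_dataset, batch_size=4, collate_fn=pad_collate)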

I’m not sure why the targets would have a different length, since one-hot encoded targets should also contain values for all classes (0 for inactive, 1 for active classes).
Could you explain the targets and their shape a bit more?

Yes, sure.
If I use one-hot encoding for the targets, there is no such error because the vectors all have the same length.
But if I drop this encoding and just use the image labels as they are, for example:
Image Number 1, Label: (0, 4)
Image Number 2, Label: (1, 3, 4)
Image Number 3, Label: (1, 5, 7, 8)

Then the error appears.
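For completeness, a sketch of keeping the targets fixed-length by multi-hot encoding them inside __getitem__; the class count of 9 is only an assumption based on the highest label index (8) in the example above:

import torch

nb_classes = 9  # assumed from the highest label index (8) above

def encode_multi_hot(label_indices, nb_classes):
    # e.g. (1, 3, 4) -> tensor([0., 1., 0., 1., 1., 0., 0., 0., 0.])
    target = torch.zeros(nb_classes)
    target[torch.as_tensor(label_indices)] = 1.0
    return target

# Returning this fixed-length target from the Dataset's __getitem__ keeps the
# default collate happy no matter how many labels an image has:
# return image, encode_multi_hot(self.labels[index], nb_classes)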

Thanks