Is there a way to use a different weight for each sample in a minibatch with BCELoss, instead of looping through the minibatch?
It might not be as efficient, but you could just hand-code the BCE instead of using the built-in BCELoss, and then multiply each loss term in the batch by your desired coefficient before summing. This doesn't require a loop, just an elementwise multiply.
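The hand-coded approach above might look like the following sketch. The function name `weighted_bce` and the reduction choice (mean over each sample's elements, then a weighted average over the batch) are my assumptions, not from the original reply:

```python
import torch

def weighted_bce(pred, target, sample_weights, eps=1e-7):
    """Hand-rolled BCE with one weight per sample; no Python loop over the batch.

    pred, target: tensors of shape [batch, ...] with pred in (0, 1)
    sample_weights: tensor of shape [batch]
    """
    pred = pred.clamp(eps, 1 - eps)  # guard against log(0)
    per_elem = -(target * pred.log() + (1 - target) * (1 - pred).log())
    # Average each sample's elements, then take a weighted mean over the batch.
    per_sample = per_elem.view(per_elem.size(0), -1).mean(dim=1)
    return (per_sample * sample_weights).sum() / sample_weights.sum()
```

With uniform weights this reduces to the ordinary mean BCE, which is a handy sanity check.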
What problems are you trying to solve?
Supposing you want to give rare label combinations more weight, you can pass a weight parameter to BCELoss. If you have 4 labels A, B, C, and D, you can do
criterion = torch.nn.BCELoss(weight=torch.Tensor([1, 2, 3, 4])).cuda()
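For context, here is a minimal runnable sketch of that idea (CPU-only here, so without the `.cuda()` call; the batch size of 8 and the random data are just for illustration). A weight tensor of shape `[4]` broadcasts over the batch dimension, so each label column gets its own weight:

```python
import torch

# Per-label weights: label D's loss counts 4x as much as label A's.
criterion = torch.nn.BCELoss(weight=torch.tensor([1., 2., 3., 4.]))

pred = torch.sigmoid(torch.randn(8, 4))    # predicted probabilities in (0, 1)
target = (torch.rand(8, 4) > 0.5).float()  # 4 binary labels per sample
loss = criterion(pred, target)             # scalar: mean of weighted per-element BCE
```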
Supposing you really want to give more weight to some specific samples, you can use WeightedRandomSampler to make them appear more frequently in your batches.
Otherwise, do as @ajbrock says and roll your own custom code.
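The WeightedRandomSampler approach can be sketched like this, with toy data and hypothetical weight values (one weight per dataset item; higher means drawn more often):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

data = torch.randn(10, 3)
labels = torch.tensor([0] * 8 + [1] * 2)   # class 1 is rare

# Give the rare class's samples 4x the draw probability (value is illustrative).
sample_weights = torch.where(labels == 1, torch.tensor(4.0), torch.tensor(1.0))

# Draws 10 samples per epoch, with replacement, biased by sample_weights.
sampler = WeightedRandomSampler(sample_weights, num_samples=10, replacement=True)
loader = DataLoader(TensorDataset(data, labels), batch_size=5, sampler=sampler)
```

Note that the sampler replaces shuffling, so you pass `sampler=` instead of `shuffle=True` to the DataLoader.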
Ah, just saw this! Thanks for the reply! I ended up feeding in all the batches as one flat batch, though.
Thanks for your reply.
I just have one more question: what's the difference, in terms of final performance :), between the two methods?
In my experience, the WeightedRandomSampler is a bit better, but I can't be sure of the results.