Difference between BCEWithLogitsLoss and CrossEntropyLoss

I thought the following code would produce the same result, but it does not.
What is the difference between BCEWithLogitsLoss and CrossEntropyLoss when used for classification?

import torch
import torch.nn as nn

loss1 = nn.BCEWithLogitsLoss()
loss2 = nn.CrossEntropyLoss()

input = torch.randn(11, 3, requires_grad=True)
target = torch.empty(11, 3).random_(2)  # random 0./1. entries
output1 = loss1(input, target)
output2 = loss2(input, target)

Hi,
CrossEntropyLoss is used for multi-class (single-label) problems, where exactly one of the possible classes is the right answer for each sample. It applies a softmax across the class dimension, so the class scores compete with each other; see the sketch below.
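A minimal sketch (the tensor shapes and variable names here are illustrative, not from your post): in the usual single-label setup, CrossEntropyLoss takes one class index per sample and is equivalent to LogSoftmax followed by NLLLoss.

import torch
import torch.nn as nn

# 11 samples, 3 classes, exactly one correct class per sample
logits = torch.randn(11, 3, requires_grad=True)
labels = torch.empty(11, dtype=torch.long).random_(3)  # class indices 0..2

ce = nn.CrossEntropyLoss()
loss = ce(logits, labels)

# CrossEntropyLoss == LogSoftmax over the class dimension + NLLLoss,
# so the 3 logits of each sample are normalized jointly.
manual = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), labels)
assert torch.allclose(loss, manual)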

BCEWithLogitsLoss is used for multi-label problems, where a data point can belong to one or more classes at once. It applies an independent sigmoid to each logit, so each class is scored as its own yes/no decision; see the sketch below.
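A matching sketch for the multi-label case (again with illustrative names): BCEWithLogitsLoss is sigmoid plus BCELoss fused into one op, applied elementwise.

import torch
import torch.nn as nn

# 11 samples, 3 independent binary labels per sample
logits = torch.randn(11, 3, requires_grad=True)
targets = torch.empty(11, 3).random_(2)  # each entry is 0. or 1.

bce = nn.BCEWithLogitsLoss()
loss = bce(logits, targets)

# BCEWithLogitsLoss == Sigmoid + BCELoss applied elementwise
# (the fused version is just more numerically stable),
# so each of the 3 classes is an independent binary decision.
manual = nn.BCELoss()(torch.sigmoid(logits), targets)
assert torch.allclose(loss, manual)

This is also why your two losses disagree on identical tensors: CrossEntropyLoss normalizes the three scores of each sample jointly with softmax, while BCEWithLogitsLoss squashes each score independently with sigmoid, so they compute different quantities.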