Output of CNN is negative

tensor([[[-2.1100, -2.0569, -2.1977, -2.1016, -2.0881, -2.1053, -2.0627,
          -2.1315, -2.1637, -2.1518],
         [-2.1073, -2.0807, -2.1757, -2.1125, -2.1710, -2.1226, -2.1058,
          -2.1105, -2.1899, -2.0638],
         [-2.1084, -2.0598, -2.1171, -1.9930, -2.1466, -2.1421, -2.2110,
          -2.1488, -2.1378, -2.0872],
         [-2.1294, -2.1019, -2.0210, -2.0828, -2.1360, -2.1737, -2.1516,
          -2.1104, -2.2015, -2.1766]],

        [[-2.2347, -2.1515, -2.2699, -2.1849, -2.1745, -2.1778, -2.1453,
          -2.2436, -2.2160, -2.2708],
         [-2.1920, -2.2320, -2.2267, -2.1805, -2.1464, -2.1578, -2.2395,
          -2.1214, -2.2896, -2.1644],
         [-2.1465, -2.1630, -2.2229, -2.1454, -2.2178, -2.1901, -2.2487,
          -2.2327, -2.2180, -2.1987],
         [-2.2261, -2.1394, -2.0773, -2.1365, -2.2153, -2.2789, -2.2637,
          -2.1818, -2.2304, -2.2258]],

This is the output of the CNN trained with loss function = nn.BCEWithLogitsLoss()

tensor([[[0.1709, 0.1671, 0.1541, 0.1607, 0.1622, 0.1673, 0.1629, 0.1589,
          0.1618, 0.1577],
         [0.1631, 0.1658, 0.1580, 0.1655, 0.1627, 0.1625, 0.1589, 0.1713,
          0.1580, 0.1706],
         [0.1670, 0.1695, 0.1612, 0.1720, 0.1635, 0.1591, 0.1608, 0.1624,
          0.1621, 0.1595],
         [0.1570, 0.1709, 0.1782, 0.1748, 0.1628, 0.1625, 0.1655, 0.1632,
          0.1527, 0.1606]],

        [[0.0781, 0.0832, 0.0763, 0.0851, 0.0800, 0.0796, 0.0784, 0.0775,
          0.0751, 0.0807],
         [0.0810, 0.0764, 0.0735, 0.0796, 0.0740, 0.0773, 0.0757, 0.0826,
          0.0772, 0.0826],
         [0.0812, 0.0796, 0.0788, 0.0864, 0.0742, 0.0796, 0.0714, 0.0780,
          0.0784, 0.0829],
         [0.0777, 0.0786, 0.0870, 0.0794, 0.0759, 0.0753, 0.0749, 0.0793,
          0.0758, 0.0745]],

and this is the same output after applying F.sigmoid().
I want to turn the multiclass output into binary predictions by thresholding the sigmoid values at 0.5, but all of the elements are less than 0.5, so every binary prediction is 0 (see the sketch below).
What is the problem?
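
For reference, this is roughly how the binary predictions are produced (a minimal sketch; logits is an illustrative stand-in for the raw CNN output above):

    import torch

    # Illustrative logits: all negative, like the raw CNN output above
    logits = -torch.rand(2, 4, 10) - 2.0

    probs = torch.sigmoid(logits)    # torch.sigmoid is the non-deprecated form of F.sigmoid
    binary = (probs > 0.5).int()     # every element is 0, since all probabilities are < 0.5
    print(binary.sum())              # tensor(0)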

It’s unclear what might be causing the model to predict class 0 without any information about your use case, but I would start by checking whether your dataset is imbalanced and class 0 is the majority class.
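
A quick way to check the label distribution (a minimal sketch, assuming your integer class labels are collected in a tensor called targets):

    import torch

    # Hypothetical targets: integer class labels of the training set
    targets = torch.tensor([0, 0, 0, 1, 0, 2, 0, 0, 1, 0])

    counts = torch.bincount(targets)         # samples per class
    print(counts)                            # tensor([7, 2, 1]) -> class 0 dominates
    print(counts.float() / counts.sum())     # tensor([0.7000, 0.2000, 0.1000])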

For a multiclass (not multilabel) problem, you would use softmax instead of sigmoid. So output = output.softmax(dim=-1) should give you the probabilities.
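
Something like this (a small sketch; output is assumed to hold the raw logits with the classes in the last dimension):

    import torch

    output = torch.randn(2, 4, 10)      # stand-in for the raw logits
    probs = output.softmax(dim=-1)      # probabilities; each row sums to 1
    preds = probs.argmax(dim=-1)        # predicted class index per sample

    print(probs.sum(dim=-1))            # all ones
    print(preds.shape)                  # torch.Size([2, 4])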

NIT: you can use softmax to compute the probabilities from the returned logits, but you should not pass the probabilities to nn.CrossEntropyLoss, as raw logits are expected for a multi-class use case.
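
I.e. something along these lines (a sketch with made-up shapes; the point is that the raw logits go into the criterion, while softmax is only used for inspection):

    import torch
    import torch.nn as nn

    criterion = nn.CrossEntropyLoss()

    logits = torch.randn(8, 10)             # raw model output: [batch_size, num_classes]
    target = torch.randint(0, 10, (8,))     # integer class labels

    loss = criterion(logits, target)        # pass the raw logits, not softmax outputs
    probs = logits.softmax(dim=-1)          # probabilities only for logging/inspection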