[Solved] Multi label classification won't work well

Solved: it turned out to be a trivial mistake on my side. If you have permission to delete this topic, please do so.


Hi, I'm trying multi-label image classification: each training image has k > 0 labels.
I use binary cross entropy loss for each class label:

L = - t log y - (1 - t) log (1 - y)

where t ∈ {0, 1} is the target label and y is the sigmoid output of the model.
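As a quick sanity check of the formula, here is a minimal pure-Python version of the per-label loss (the `bce` helper is just for illustration, not part of my model code):

```python
import math

def bce(t, y):
    """Per-label binary cross entropy: L = -t*log(y) - (1-t)*log(1-y)."""
    return -t * math.log(y) - (1 - t) * math.log(1 - y)

# A confident correct prediction (t=1, y=0.9) gives a small loss...
print(bce(1, 0.9))  # ≈ 0.105
# ...while a confident wrong one (t=1, y=0.1) is heavily penalised.
print(bce(1, 0.1))  # ≈ 2.303
```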

But training makes no progress, as in the following image (x-axis shows 10^2 iterations):

(attached image of the flat loss curve: IMG_2333)

Here is the model definition:

import torch.nn as nn

# N_tags is the number of labels, defined elsewhere in my script.

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.conv1 = nn.Conv2d(3, 64, (3, 3), (1, 1), (1, 1))
        self.conv2 = nn.Conv2d(64, 64, (3, 3), (1, 1), (1, 1))
        self.conv2_drop = nn.Dropout(0.25)
        self.pool1 = nn.MaxPool2d((4, 4), (4, 4))
        self.bn1 = nn.BatchNorm2d(64, 0.001, 0.9, True)
        self.conv3 = nn.Conv2d(64, 128, (3, 3), (1, 1), (1, 1))
        self.conv4 = nn.Conv2d(128, 128, (3, 3), (1, 1), (1, 1))
        self.conv4_drop = nn.Dropout(0.25)
        self.pool2 = nn.MaxPool2d((4, 4), (4, 4))
        self.bn2 = nn.BatchNorm2d(128, 0.001, 0.9, True)
        self.conv5 = nn.Conv2d(128, 256, (3, 3), (1, 1), (1, 1))
        self.conv6 = nn.Conv2d(256, 256, (3, 3), (1, 1), (1, 1))
        self.conv6_drop = nn.Dropout(0.25)
        self.pool3 = nn.MaxPool2d((4, 4), (4, 4))
        self.bn3 = nn.BatchNorm2d(256, 0.001, 0.9, True)
        self.conv7 = nn.Conv2d(256, 128, (3, 3), (1, 1), (1, 1))
        self.linear1 = nn.Linear(3072, 128)
        self.linear2 = nn.Linear(128, N_tags)
        self.sigmoid = nn.Sigmoid()

I use

  • Adam as the optimizer, with the learning rate set to each of {0.1, 0.01, 0.001, 0.0001}, but all of them produce a loss curve like the one above.
  • a batch size of 50.

Any help would be appreciated.
Thanks.

Postscript

It was because I forgot to call loss.backward()!
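For anyone who hits the same symptom (flat loss at any learning rate): without `loss.backward()`, the gradients are never computed and `optimizer.step()` updates nothing. A minimal sketch of the correct step order, using a tiny stand-in linear model and random multi-label data rather than my actual network:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
N_tags = 5
model = nn.Linear(10, N_tags)  # stand-in for the conv net above
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
criterion = nn.BCEWithLogitsLoss()  # sigmoid + BCE in one, numerically stable

x = torch.randn(50, 10)                        # a batch of 50 flattened inputs
t = torch.randint(0, 2, (50, N_tags)).float()  # {0,1} multi-label targets

losses = []
for step in range(100):
    optimizer.zero_grad()   # clear gradients from the previous step
    y = model(x)            # raw logits; BCEWithLogitsLoss applies the sigmoid
    loss = criterion(y, t)
    loss.backward()         # <-- the line I forgot: computes the gradients
    optimizer.step()        # applies the Adam update
    losses.append(loss.item())
```

With `loss.backward()` in place the loss drops over the 100 steps; comment it out and `losses` stays flat, exactly like my plot.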