Shape of Sigmoid activation function in NN

Hi,
I'm trying to build a NN for binary classification. Here is my code:

CLASSES_LENGTH = 1

class NN_FullyConnected(nn.Module):
    def __init__(self):
        super().__init__()
        self.Input_to_Hidden1 = nn.Linear(DIMENSION, int(DIMENSION))
        self.DropOut1 = nn.Dropout(p=0.5)
        self.Hidden1_to_Hidden2 = nn.Linear(int(DIMENSION), int(DIMENSION))
        self.DropOut2 = nn.Dropout(p=0.5)
        self.Hidden2_to_Hidden3 = nn.Linear(int(DIMENSION), int(DIMENSION))
        self.DropOut3 = nn.Dropout(p=0.5)
        self.Hidden3_to_Output = nn.Linear(int(DIMENSION), CLASSES_LENGTH)

    def forward(self, X):
        # set_trace()
        x_inp = nn.ReLU()(self.Input_to_Hidden1(X))
        x_d1 = self.DropOut1(x_inp)
        x_hid1_2 = nn.ReLU()(self.Hidden1_to_Hidden2(x_d1))
        x_d2 = self.DropOut2(x_hid1_2)
        x_hid2_3 = nn.ReLU()(self.Hidden2_to_Hidden3(x_d2))
        x_d3 = self.DropOut3(x_hid2_3)
        x_out = self.Hidden3_to_Output(x_d3)
        y = nn.Sigmoid()(x_out)
        return y

In the debugger, I saw that the output shape (y) is [10003, 1].


I expect to get only [10003]. What am I doing wrong?
Thank you!!

Why do you expect to get that shape and not [10003, 1]? What if the number of classes were 2? It would be [10003, 2], right? So for one class I think it's fine to have [10003, 1].
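To illustrate why the output has a trailing dimension of 1: the final `nn.Linear` layer always produces one value per output feature, so `out_features=1` gives a `[batch, 1]` tensor. A quick sketch (using a made-up `DIMENSION` of 16 in place of whatever value your code defines):

```python
import torch
import torch.nn as nn

layer = nn.Linear(16, 1)        # final layer with out_features=1, like Hidden3_to_Output
x = torch.randn(10003, 16)      # a batch of 10003 samples
print(layer(x).shape)           # torch.Size([10003, 1]), not torch.Size([10003])
```

With `nn.Linear(16, 2)` instead, the same input would produce `[10003, 2]`.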

Hi @Isaac_Kargar, Thank you for your answer.
I know, but I got a problem in the loss function:
my code:
loss = criterion(y_pred, y_real.to(device))

my criterion is BCE

ValueError: Target size (torch.Size([10003])) must be the same as input size (torch.Size([10003, 1]))

That's why I started to question the output shape of the NN.
Do I need to reshape the output before using the criterion function?

Thank you

Yes, you can do it with y_real.view(-1, 1).
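For completeness, here is a minimal sketch of both ways to resolve the shape mismatch with `nn.BCELoss` (the tensor sizes and random data below are made up to mirror the error message):

```python
import torch
import torch.nn as nn

criterion = nn.BCELoss()

y_pred = torch.rand(10003, 1)                     # model output after Sigmoid, shape [10003, 1]
y_real = torch.randint(0, 2, (10003,)).float()    # binary targets, shape [10003]

# Option 1: reshape the targets to match the model output
loss = criterion(y_pred, y_real.view(-1, 1))

# Option 2: flatten the model output to match the targets
loss = criterion(y_pred.squeeze(1), y_real)
```

Either way, input and target end up with the same shape, which is what `BCELoss` requires.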
