[Beginner] Error implementing FC layer after Dropout layer

[image: diagram of the target model architecture]
I tried implementing the above model word for word like this:

class TargetA(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(64, 5, 5)
        self.conv2 = nn.Conv2d(64, 5, 5)
        self.drop1 = nn.Dropout(0.25)
        self.fc1 = nn.Linear(128)
        self.drop2 = nn.Dropout(0.5)
        self.fc2 = nn.Linear()

    def forward(self, x):
        x = self.conv1(x)
        x = nn.ReLU(x)
        x = self.conv2(x)
        x = nn.ReLU(x)
        x = nn.Dropout(0.25)(x)
        x = self.fc1(x)
        x = nn.ReLU(x)
        x = nn.Dropout(0.5)(x)
        x = self.fc2(x)
        x = nn.Softmax(x)
        return x

line 51, in __init__
self.fc1 = nn.Linear(128)
TypeError: __init__() missing 1 required positional argument: 'out_features'

It doesn't like how I'm doing the Linear layers. Any help on how to implement the model in the photo above?

Hi, the linear layer requires you to specify how many features you need as input and as output.
In your case the convolutional layers are also wrong. The signature is
self.conv1 = nn.Conv2d(in_channels, out_channels, kernel_size)
so nn.Conv2d(64, 5, 5) means 64 input channels, 5 output channels, and a 5x5 kernel. Therefore the output of conv1 will have 5 channels, which mismatches the 64 input channels that conv2 expects.

The fully connected layer (or linear layer)
takes
nn.Linear(in_features, out_features)
in_features must match the number of features actually fed into it, i.e. the flattened size of conv2's output (channels x height x width), so you also need to flatten the activation before fc1.
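
For example, here is a minimal sketch of how the signatures chain together (the channel counts and the 28x28 input size are just illustrative placeholders, not the values from your picture):

import torch
import torch.nn as nn

conv1 = nn.Conv2d(1, 32, 5)         # 1 input channel -> 32 output channels, 5x5 kernel
conv2 = nn.Conv2d(32, 64, 5)        # in_channels must equal conv1's out_channels
fc = nn.Linear(64 * 20 * 20, 128)   # in_features = flattened size of conv2's output

x = torch.randn(8, 1, 28, 28)       # batch of 8 single-channel 28x28 images
x = conv2(conv1(x))                 # shape: (8, 64, 20, 20) with 5x5 kernels, no padding
x = torch.flatten(x, 1)             # shape: (8, 64*20*20)
x = fc(x)                           # shape: (8, 128)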

Thanks for the response. How would you interpret the first two layers based on the top picture? Is 64 the input and output size? Is 5x5 the kernel size?
My guess so far would be:

class TargetA(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(?, ?, ?)
        self.conv2 = nn.Conv2d(?, ?, ?)
        self.drop1 = nn.Dropout(0.25)
        self.fc1 = nn.Linear(?, 128)
        self.drop2 = nn.Dropout(0.5)
        self.fc2 = nn.Linear(128,10)

Well, from the picture I would say the kernel size is 5x5.
The rest is not really defined. The second convolution should be nn.Conv2d(64, 128, 5), unless they concatenate anything.
The first one should go from your input channels to 64.
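
For what it's worth, here is a minimal runnable sketch that follows those guesses. The single input channel, the 28x28 input size, and the 10 output classes are assumptions (e.g. MNIST), not values read from the picture:

import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetA(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 64, 5)    # assumed: 1 input channel -> 64, 5x5 kernel
        self.conv2 = nn.Conv2d(64, 128, 5)  # 64 -> 128, as guessed above
        self.drop1 = nn.Dropout(0.25)
        # a 28x28 input shrinks to 24x24 after conv1 and 20x20 after conv2 (no padding)
        self.fc1 = nn.Linear(128 * 20 * 20, 128)
        self.drop2 = nn.Dropout(0.5)
        self.fc2 = nn.Linear(128, 10)       # assumed: 10 classes

    def forward(self, x):
        x = F.relu(self.conv1(x))   # apply relu functionally instead of nn.ReLU(x)
        x = F.relu(self.conv2(x))
        x = self.drop1(x)           # reuse the dropout modules defined in __init__
        x = torch.flatten(x, 1)     # flatten to (batch, 128*20*20) before the linear layer
        x = F.relu(self.fc1(x))
        x = self.drop2(x)
        x = self.fc2(x)
        return x                    # raw logits; nn.CrossEntropyLoss applies softmax itself

model = TargetA()
out = model(torch.randn(8, 1, 28, 28))
print(out.shape)  # torch.Size([8, 10])

If you need probabilities at inference time, apply F.softmax(out, dim=1) to the returned logits rather than putting a Softmax inside the model.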