Given the network architecture below, how can I define the fully connected layer fc1 in a generalized way, i.e. something like nn.Linear($size_of_previous_layer$, 50)?
The main issue arises from x = F.relu(self.fc1(x)) in the forward function. After the flatten, I need to add several dense layers, but to my understanding self.fc1 must be initialized with an input size, which has to be calculated from the previous layers. How can I declare the self.fc1 layer in a generalized manner?
My Thought:
I can calculate the output size of each convolutional layer by hand, and since I have just 3 layers, that is feasible. But in the case of n layers, how can you get the output size of the final convolutional layer?
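One way to do the hand calculation programmatically is the standard output-size formula, out = floor((in + 2*pad - kernel) / stride) + 1, applied once per conv and once per pool. A minimal sketch for the three stages below, assuming the [3, 32, 32] input (the helper names are illustrative):

```python
# Hypothetical helpers using the standard conv/pool output-size formula:
# out = floor((in + 2*pad - kernel) / stride) + 1
def conv2d_out(size, kernel=3, stride=1, pad=1):
    return (size + 2 * pad - kernel) // stride + 1

def pool2d_out(size, kernel=2, stride=2):
    return (size - kernel) // stride + 1

# Walk the three conv+pool stages for a 32x32 input.
size = 32
for _ in range(3):  # each conv (padding=1 keeps the size) is followed by max_pool2d(2)
    size = pool2d_out(conv2d_out(size))

flat_features = 40 * size * size  # 40 output channels from conv3
print(flat_features)              # 640
```

With n layers this loop just runs n times, one step per (kernel, stride, padding) triple.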
```python
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(3, 10, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=3, padding=1)
        self.conv2_drop = nn.Dropout2d(0.4)
        self.conv3 = nn.Conv2d(20, 40, kernel_size=3, padding=1)
        self.conv3_drop = nn.Dropout2d(0.4)
        # 640 = 40 channels * 4 * 4 spatial for a [3, 32, 32] input
        self.fc1 = nn.Linear(640, 50)  # goal: nn.Linear($size_of_previous_layer$, 50)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = F.relu(F.max_pool2d(self.conv3_drop(self.conv3(x)), 2))
        x = x.flatten(1)
        x = F.relu(self.fc1(x))
        return F.log_softmax(x, dim=1)
```
- The input to the above architecture can be assumed to be [3, 32, 32] (num_of_channels, height, width).
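A common trick that avoids the arithmetic entirely is to run a dummy tensor of the expected input shape through the conv stack once and read off the flattened size. A sketch under the same architecture and the assumed [3, 32, 32] input (variable names are illustrative):

```python
import torch
import torch.nn as nn

# The conv/pool stack from the network above, without the dropout layers
# (dropout does not change the shape, so it can be omitted for shape inference).
convs = nn.Sequential(
    nn.Conv2d(3, 10, kernel_size=3, padding=1), nn.MaxPool2d(2),
    nn.Conv2d(10, 20, kernel_size=3, padding=1), nn.MaxPool2d(2),
    nn.Conv2d(20, 40, kernel_size=3, padding=1), nn.MaxPool2d(2),
)

with torch.no_grad():
    dummy = torch.zeros(1, 3, 32, 32)              # one sample of the expected shape
    n_features = convs(dummy).flatten(1).shape[1]  # flattened feature count

fc1 = nn.Linear(n_features, 50)
print(n_features)  # 640 for a [3, 32, 32] input
```

This scales to n layers unchanged: whatever the stack is, one dummy forward pass in __init__ yields the size to hand to nn.Linear.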
@ptrblck, could you help me?
PS:
- For a single convolutional layer it is quite easy to compute by hand; the question is about the case of n convolutional layers.
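For the n-layer case there is also nn.LazyLinear (available in PyTorch 1.8+), which infers in_features from the first batch it sees, so no size needs to be declared at all. A minimal sketch:

```python
import torch
import torch.nn as nn

# nn.LazyLinear takes only out_features; in_features is inferred
# from the first forward pass.
fc1 = nn.LazyLinear(50)

x = torch.zeros(4, 640)  # e.g. the flattened conv output for a batch of 4
out = fc1(x)             # first call materializes the weight as (50, 640)
print(fc1.in_features, out.shape)  # 640 torch.Size([4, 50])
```

One caveat: the module's parameters are uninitialized until that first call, so run a dummy batch through the model before passing its parameters to an optimizer.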