Input size in first layer of nn.Linear

Hello :slight_smile:

If I want to construct the model like this

import torch.nn as nn

class MyNeuralNet(nn.Module):
    def __init__(self):
        super().__init__()
        ## Some layers before...
        self.layer_1 = nn.Linear(input_size, hidden_size)
        self.layer_2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        ## Some layers before...
        x = self.layer_1(x)
        x = self.layer_2(x)
        return x

but I don’t want to change input_size every time I change the data or the hyperparameter settings of the previous layer. What should I do?

Thank you

make it an arg to __init__?
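The suggestion above can be sketched like this: pass the sizes into `__init__` so the same class works for any data shape (the concrete sizes below are just example values):

```python
import torch
import torch.nn as nn

class MyNeuralNet(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        self.layer_1 = nn.Linear(input_size, hidden_size)
        self.layer_2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = self.layer_1(x)
        x = self.layer_2(x)
        return x

# Reconfigure without touching the class definition:
model = MyNeuralNet(input_size=20, hidden_size=64, output_size=10)
out = model(torch.randn(4, 20))
print(out.shape)  # torch.Size([4, 10])
```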

Can I initialize the input size in the forward method instead?

This is not possible, because you would have to create a new layer on every forward pass, which would result in new (and therefore untrained) parameters each time.
You could add a global average pooling before the FC layers; it produces a fixed-size output (matching the input size of your FC layer) regardless of the spatial size of the data.
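A minimal sketch of that idea, assuming a small example conv net (layer sizes are illustrative): `nn.AdaptiveAvgPool2d(1)` averages each channel down to 1×1, so the FC layer's input size is fixed to the channel count no matter how large the input images are.

```python
import torch
import torch.nn as nn

class ConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv = nn.Conv2d(3, 32, kernel_size=3, padding=1)
        self.pool = nn.AdaptiveAvgPool2d(1)   # global average pooling to 1x1
        self.fc = nn.Linear(32, num_classes)  # input fixed to channel count

    def forward(self, x):
        x = torch.relu(self.conv(x))
        x = self.pool(x).flatten(1)  # shape (N, 32) regardless of H and W
        return self.fc(x)

model = ConvNet()
# The same FC layer handles different spatial input sizes:
print(model(torch.randn(2, 3, 28, 28)).shape)  # torch.Size([2, 10])
print(model(torch.randn(2, 3, 64, 64)).shape)  # torch.Size([2, 10])
```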

@justusschock Thank you. :slight_smile: