Making a fully connected layer bi-directional

As far as I know, when we use nn.LSTM(..., hidden_dim, bidirectional=True), we typically define the next fully connected layer as self.fc1 = nn.Linear(hidden_dim * 2, ...). My understanding is that this makes the fully connected layer bidirectional as well. If so, could we make self.fc2 bidirectional too?

Please correct me if my understanding is wrong.

You are only doubling the number of input features, since the preceding LSTM outputs num_directions * hidden_size features per time step.
The linear layer itself is not "bidirectional" in the sense an LSTM is: the incoming activation is not processed in a sequential/directional manner, so there is no direction to double.
You could of course increase the number of hidden units in the second linear layer as well, but that is an ordinary capacity choice, not bidirectionality.
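
To illustrate, here is a minimal sketch (the class name BiLSTMClassifier and the dimensions fc_dim, num_classes are made up for this example, not taken from your code): fc1 must take hidden_dim * 2 input features to match the concatenated forward/backward LSTM outputs, while fc2's sizes are free design choices.

```python
import torch
import torch.nn as nn

class BiLSTMClassifier(nn.Module):
    def __init__(self, input_dim, hidden_dim, fc_dim, num_classes):
        super().__init__()
        # Bidirectional LSTM: each time step outputs hidden_dim * 2
        # features (num_directions * hidden_size), one hidden_dim
        # per direction, concatenated.
        self.lstm = nn.LSTM(input_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # fc1 takes hidden_dim * 2 inputs only to match the LSTM output
        # size; the layer itself has no notion of direction.
        self.fc1 = nn.Linear(hidden_dim * 2, fc_dim)
        # fc2's in/out sizes are plain hyperparameters; nothing forces
        # a factor of 2 here.
        self.fc2 = nn.Linear(fc_dim, num_classes)

    def forward(self, x):
        out, _ = self.lstm(x)      # out: [batch, seq_len, hidden_dim * 2]
        last = out[:, -1, :]       # last time step, kept simple for the sketch
        return self.fc2(torch.relu(self.fc1(last)))

model = BiLSTMClassifier(input_dim=10, hidden_dim=32, fc_dim=64, num_classes=5)
x = torch.randn(8, 20, 10)         # [batch, seq_len, input_dim]
print(model(x).shape)              # torch.Size([8, 5])
```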
