nn.Linear not changing the size of the tensor

I have a basic network, but for some reason when I pass the input through a linear layer, the dimensions are not changing. Did I make a beginner mistake somewhere?
The code is below, along with what it printed.

import torch.nn as nn
import torch.nn.functional as F

class ImageOnlyNetwork(nn.Module):

    def __init__(self, vocab_len, d_word, d_hidden):
        super(ImageOnlyNetwork, self).__init__()

        self.d_hidden = d_hidden # 256

        self.context_fc7_linear = nn.Linear(4096, d_hidden)

        # Ignore these two
        self.context_fc7_lstm = nn.LSTM(4096, d_hidden, 1)
        self.answer_embedding = nn.Embedding(vocab_len, d_word)

    def forward(self, context_fc7, answer_fc7, answers, a_mask):

        print("context fc7 size: ", context_fc7.size()) # Prints (64L, 3L, 4096L)

        context_length = context_fc7.size()[1] # 3

        # Collapse the batch and sequence dims: (64, 3, 4096) -> (192, 4096)
        context_fc7 = context_fc7.view(-1, context_fc7.size(2))
        print("context fc7 size after view: ", context_fc7.size()) # Prints (192L, 4096L)

        context_fc7_lin = self.context_fc7_linear(context_fc7) 

        context_fc7_lin = F.relu(context_fc7)
        print("contextfc7lin size: ", context_fc7_lin.size()) # Prints (192L, 4096L)), but should be (192L, 256L)

The line

context_fc7_lin = F.relu(context_fc7)

should be

context_fc7_lin = F.relu(context_fc7_lin)

nn.Linear is changing the size; you then overwrite context_fc7_lin with the ReLU of the original 4096-dimensional input, so the size you print is the input's, not the linear layer's output.
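If it helps, here is a minimal standalone check (just a sketch, assuming a recent PyTorch where Variables aren't needed, with the shapes taken from your prints) showing that the linear layer does map 4096 down to 256 once its output is actually used:

import torch
import torch.nn as nn
import torch.nn.functional as F

linear = nn.Linear(4096, 256)  # maps the last dim 4096 -> 256
x = torch.randn(192, 4096)     # same shape as the flattened context_fc7
out = F.relu(linear(x))        # ReLU applied to the linear layer's output
print(out.size())              # torch.Size([192, 256])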

That was very silly of me. I should have checked it more carefully. Thanks!

An easy mistake to make.