PyTorch tutorial error: what's the difference between the following code snippets?

Hello, I have been learning TensorBoard with the tensorboard_tutorial. It defines a class Net; the official code is as follows:

import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 6, 5)
        self.pool = nn.MaxPool2d(2, 2)
        self.conv2 = nn.Conv2d(6, 16, 5)
        self.fc1 = nn.Linear(16 * 4 * 4, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, 10)

    def forward(self, x):
        x = self.pool(F.relu(self.conv1(x)))
        x = self.pool(F.relu(self.conv2(x)))
        x = x.view(-1, 16 * 4 * 4)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = self.fc3(x)
        return x

net = Net()
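For context, the tutorial then writes the graph roughly like this. This is only a minimal sketch: the log directory name is taken from the tutorial, and the random tensor is a stand-in for one batch of FashionMNIST images (1x28x28) that the tutorial pulls from its dataloader.

import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('runs/fashion_mnist_experiment_1')  # log dir as in the tutorial
images = torch.randn(4, 1, 28, 28)  # stand-in for one FashionMNIST batch from the dataloader
writer.add_graph(net, images)       # traces net on the batch and writes the graph
writer.close()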

This works fine when I run writer.add_graph(net, images). I recently saw another way to implement the same network; my personal code is as follows:

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.feature1 = nn.Sequential(
            nn.Conv2d(1, 6, 5),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2, 2),
            nn.Conv2d(6, 16, 5),
            nn.ReLU(inplace=True),
            nn.MaxPool2d(2, 2)
        )
        self.feature2 = nn.Sequential(
            nn.Linear(256, 120),
            nn.ReLU(inplace=True),
            nn.Linear(120, 84),
            nn.ReLU(inplace=True),
            nn.Linear(84, 10)
        )

    def forward(self, x):
        x = self.feature1(x)
        x.view(-1, 256)
        x = self.feature2(x)
        return x

Here I use nn.Sequential, and when I run writer.add_graph(net, images), it reports the following error:

RuntimeError: size mismatch, m1: [256 x 4], m2: [256 x 120] at C:\w\1\s\tmp_conda_3.7_055457\conda\conda-bld\pytorch_1565416617654\work\aten\src\TH/generic/THTensorMath.cpp:752

Process finished with exit code 1

So what is the difference between them?
Thanks for answering my question~~~

You are missing the assignment x = x.view(-1, 256). Tensor.view returns a new, reshaped tensor and does not modify x in place, so feature2 still receives the unflattened (N, 16, 4, 4) activations; nn.Linear(256, 120) then sees 4 input features instead of 256, which is exactly the size mismatch in the error.
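For reference, a corrected forward only needs the view result assigned back to x:

    def forward(self, x):
        x = self.feature1(x)
        x = x.view(-1, 256)   # assign the flattened tensor back to x
        x = self.feature2(x)
        return x

Equivalently, x = x.view(x.size(0), -1) or x = torch.flatten(x, 1) avoids hard-coding 256.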

Thanks, I didn't notice that. :joy: