If this is our network:

```python
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)
        self.fc2 = nn.Linear(50, 10)
        self.fc4 = nn.Linear(320, 40)
        self.fc5 = nn.Linear(40, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc4(x))
        x = F.dropout(x, training=self.training)
        x = self.fc5(x)
        return F.log_softmax(x, dim=1)

model = Net()
if args.cuda:
    model.cuda()
```

when you use **print(model)**,

it just prints the network components you defined in **__init__**, like this:

```
Net (
  (conv1): Conv2d(1, 10, kernel_size=(5, 5), stride=(1, 1))
  (conv2): Conv2d(10, 20, kernel_size=(5, 5), stride=(1, 1))
  (conv2_drop): Dropout2d (p=0.5)
  (fc1): Linear (320 -> 50)
  (fc2): Linear (50 -> 10)
  (fc4): Linear (320 -> 40)
  (fc5): Linear (40 -> 10)
)
```

This is not the real data flow of the network. Notice that I do not use **fc1** and **fc2**; I use **fc4** and **fc5**.

How can I see the real data-flow graph of my network? I need to know what my network actually looks like. I once had an embarrassing bug where the network was not connected the way I thought, and backprop stopped halfway through.
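One way I can imagine checking this (a sketch, not a definitive answer): register a forward hook on every submodule and run a single dummy input through the model. Only the modules that actually execute fire their hooks, so the recorded list is the real data flow, and **fc1**/**fc2** would be missing from it. Everything here uses the `Net` class from above; the hook helper and the 28x28 MNIST-shaped dummy input are my assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)   # defined but never called in forward
        self.fc2 = nn.Linear(50, 10)    # defined but never called in forward
        self.fc4 = nn.Linear(320, 40)
        self.fc5 = nn.Linear(40, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)
        x = F.relu(self.fc4(x))
        x = F.dropout(x, training=self.training)
        x = self.fc5(x)
        return F.log_softmax(x, dim=1)

model = Net()
called = []  # records submodule names in the order they execute

def make_hook(name):
    # the hook fires right after the named module's forward pass
    def hook(module, inputs, output):
        called.append(name)
    return hook

for name, module in model.named_modules():
    if name:  # skip the root Net module itself
        module.register_forward_hook(make_hook(name))

model(torch.randn(1, 1, 28, 28))  # one dummy MNIST-sized input
print(called)  # -> ['conv1', 'conv2', 'conv2_drop', 'fc4', 'fc5']
```

Modules that never appear in `called` (here `fc1` and `fc2`) are registered in `__init__` but are not part of the actual graph, which is exactly the kind of disconnect the bug above describes.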