How can I know the data flow in the network?

If this is our network:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
        self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
        self.conv2_drop = nn.Dropout2d()
        self.fc1 = nn.Linear(320, 50)  # defined but never used in forward()
        self.fc2 = nn.Linear(50, 10)   # defined but never used in forward()
        self.fc4 = nn.Linear(320, 40)
        self.fc5 = nn.Linear(40, 10)

    def forward(self, x):
        x = F.relu(F.max_pool2d(self.conv1(x), 2))
        x = F.relu(F.max_pool2d(self.conv2_drop(self.conv2(x)), 2))
        x = x.view(-1, 320)  # flatten: 20 channels * 4 * 4 = 320
        x = F.relu(self.fc4(x))
        x = F.dropout(x, training=self.training)
        x = self.fc5(x)
        return F.log_softmax(x, dim=1)

model = Net()
if args.cuda:
    model.cuda()

When you use print(model),
it just prints the network components you defined in __init__, like this:

Net (
  (conv1): Conv2d(1, 10, kernel_size=(5, 5), stride=(1, 1))
  (conv2): Conv2d(10, 20, kernel_size=(5, 5), stride=(1, 1))
  (conv2_drop): Dropout2d (p=0.5)
  (fc1): Linear (320 -> 50)
  (fc2): Linear (50 -> 10)
  (fc4): Linear (320 -> 40)
  (fc5): Linear (40 -> 10)
)

They do not show the real data flow in the network. Please notice that I do not use fc1 and fc2; I use fc4 and fc5.

How can I know the real data flow graph in my network? I need to know what my network looks like. There was an embarrassing bug where my network was not connected in the way I thought, and backprop stopped halfway through.
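For reference, the graph autograd actually records can be walked through the output's grad_fn attribute. Below is a minimal sketch (the traverse helper and its indented print format are my own, not a library API); only operations that really ran appear, so fc4 and fc5 show up while fc1 and fc2 do not:

import torch

model = Net()
out = model(torch.randn(1, 1, 28, 28))  # dummy MNIST-sized batch

def traverse(fn, depth=0):
    # Walk the recorded backward graph; each node is one op that really ran.
    if fn is None:
        return
    print('  ' * depth + type(fn).__name__)
    for next_fn, _ in fn.next_functions:
        traverse(next_fn, depth + 1)

traverse(out.grad_fn)

Shared nodes may be printed more than once, but for a small network like this the printout is easy to read.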

Does this post help?

Thank you for your help.

No, I’ve tried that code. It doesn’t work.

Then I don’t think there’s an existing tool/script that can visualize the data flow.
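You can, however, log the real execution order yourself with forward hooks, since a hook only fires for modules that actually run. A minimal sketch, assuming the Net from above (the log_call helper and its output format are mine, not an existing tool):

import torch

def log_call(module, inputs, output):
    # Fires once per actual forward call; modules that never run stay silent.
    print('{}: {} -> {}'.format(module.__class__.__name__,
                                tuple(inputs[0].shape),
                                tuple(output.shape)))

model = Net()
handles = [m.register_forward_hook(log_call) for m in model.children()]
model(torch.randn(1, 1, 28, 28))  # fc1 and fc2 print nothing: they never run
for h in handles:
    h.remove()

On a 1x1x28x28 input this prints conv1, conv2, conv2_drop, fc4, and fc5 with their input/output shapes; fc1 and fc2 never appear, which would have exposed the bug above.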

This may also be helpful:

A result (in my case):

Sequential (
  (0): Linear (29 -> 1024), weights=((1024L, 29L), (1024L,)), parameters=30720
  (1): Dropout (p = 0.05), weights=(), parameters=0
  (2): Tanh (), weights=(), parameters=0
  (3): BatchNorm1d(1024, eps=1e-05, momentum=0.1, affine=True), weights=((1024L,), (1024L,)), parameters=2048
  (4): Linear (1024 -> 128), weights=((128L, 1024L), (128L,)), parameters=131200
  (5): Dropout (p = 0.05), weights=(), parameters=0
  (6): Tanh (), weights=(), parameters=0
  (7): Linear (128 -> 64), weights=((64L, 128L), (64L,)), parameters=8256
  (8): Dropout (p = 0.05), weights=(), parameters=0
  (9): LeakyReLU (0.01), weights=(), parameters=0
  (10): Linear (64 -> 32), weights=((32L, 64L), (32L,)), parameters=2080
  (11): Dropout (p = 0.05), weights=(), parameters=0
  (12): Tanh (), weights=(), parameters=0
  (13): Linear (32 -> 16), weights=((16L, 32L), (16L,)), parameters=528
  (14): Dropout (p = 0.05), weights=(), parameters=0
  (15): LeakyReLU (0.01), weights=(), parameters=0
  (16): Linear (16 -> 1), weights=((1L, 16L), (1L,)), parameters=17
  (17): Sigmoid (), weights=(), parameters=0
)
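In case it helps, a per-module printout in that spirit can be generated by iterating over the model's children; a rough sketch (my own summarize helper, not the exact script that produced the listing above):

def summarize(model):
    # One line per direct submodule: its repr, parameter shapes, and count.
    for name, module in model.named_children():
        shapes = tuple(tuple(p.shape) for p in module.parameters())
        count = sum(p.numel() for p in module.parameters())
        print('({}): {}, weights={}, parameters={}'.format(
            name, module, shapes, count))

summarize(Net())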

Thank you!

But that’s just what I mean: it’s not what I want.

There is a great project! Use dmlc’s standalone TensorBoard in PyTorch!
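As a sketch of how the graph export could look, assuming the tensorboardX package (the pip-installable successor to that standalone TensorBoard integration) and its documented add_graph call:

import torch
from tensorboardX import SummaryWriter

model = Net()
dummy = torch.randn(1, 1, 28, 28)  # a single example just to trace the graph

writer = SummaryWriter('runs/net')  # writes event files under runs/net
writer.add_graph(model, dummy)
writer.close()
# Then run: tensorboard --logdir runs/net  and open the Graphs tab.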

I’m aware of this project, but I didn’t know that it can also visualize the computation graph. Thanks for the info!
