# Output of nn.Module

Here is a simple model:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetA(nn.Module):
    def __init__(self):
        super(NetA, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, (5, 5))
        self.conv2 = nn.Conv2d(6, 16, (5, 5))
        self.drop1 = nn.Dropout(p=0.5)
        self.fc1 = nn.Linear(16 * 5 * 5, 64)
        self.fc2 = nn.Linear(64, 3)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = self.drop1(x)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        x = x.view(-1, 16 * 5 * 5)
        x = F.relu(self.fc1(x))

        if 1:  # pattern 1
            x = self.fc2(x)
        else:  # pattern 2
            x = F.softmax(self.fc2(x), dim=1)

        return x
```

If I feed a certain random input to the model, the output is [[0. 0. 0.]].
But when I switch to the softmax branch (pattern 2) and feed the same kind of random input, the output is, for instance, [[0.2243069 0.3878466 0.3878466]].

My question is: why does `x = F.softmax(self.fc2(x), dim=1)` produce nonzero values even though `x = self.fc2(x)`, the preceding step in the model, gives exactly 0, 0, 0?

Hi, can you please specify the exact input size you are working with?
Also, are you sure it's exactly 0, 0, 0 and not small values that merely display as zero due to limited print precision?
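To illustrate the precision point: a tiny but nonzero value can print as zero when the display precision is limited, so an explicit comparison against zero is more reliable than reading the printed output. A minimal sketch in plain Python (the same idea applies to tensor printing):

```python
x = 1e-9  # a tiny but nonzero value

print(f"{x:.4f}")   # 4-digit precision displays it as 0.0000
print(f"{x:.12f}")  # more digits reveal the true value
print(x == 0.0)     # False: the value is not exactly zero
```

For a tensor, checking `(t == 0).all()` answers the question directly, independent of print settings.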

The input shape is (1, 3, 32, 32), generated with `np.random.random(input_shape)`,

and the output values are:

```
[[0.000000000000000000000000
  0.000000000000000000000000
  0.000000000000000000000000]]
```

Sorry, my earlier explanation was wrong.
Pattern 1 is `x = self.fc2(x); return x`.
This gives 0, 0, 0.

Pattern 2 is `x = F.softmax(self.fc2(x), dim=1); return x`.
This gives nonzero values even though `self.fc2(x)` gives 0, 0, 0.
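For reference, this behavior is expected from the definition of softmax: it exponentiates each logit and then normalizes so the outputs sum to 1, and since exp(0) = 1, an all-zero logit vector maps to a uniform distribution rather than zeros. A minimal sketch of the computation that `F.softmax` performs, in plain Python:

```python
import math

def softmax(logits):
    # exponentiate each logit, then normalize so the outputs sum to 1
    exps = [math.exp(v) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# exp(0) = 1 for every entry, so all-zero logits give 1/3 each
print(softmax([0.0, 0.0, 0.0]))
```

So a zero output from `fc2` does not imply a zero output from the softmax; softmax of any constant vector is uniform.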