I've put a simple model below:

```
import torch
import torch.nn as nn
import torch.nn.functional as F

class NetA(nn.Module):
    def __init__(self):
        super(NetA, self).__init__()
        self.conv1 = nn.Conv2d(3, 6, (5, 5))
        self.conv2 = nn.Conv2d(6, 16, (5, 5))
        self.drop1 = nn.Dropout(p=0.5)
        self.fc1 = nn.Linear(16 * 5 * 5, 64)
        self.fc2 = nn.Linear(64, 3)

    def forward(self, x):
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)
        x = self.drop1(x)
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)
        x = x.view(-1, 16 * 5 * 5)
        x = F.relu(self.fc1(x))
        if 1:  # pattern1: return raw logits
            x = self.fc2(x)
        else:  # pattern2: return softmax probabilities
            x = F.softmax(self.fc2(x), dim=1)
        return x
```
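As an aside, the `16 * 5 * 5` flatten size implies a particular input resolution; assuming a 32×32 input (CIFAR-style, which is my assumption, not stated in the model), the spatial sizes work out like this:

```python
def conv_out(size, kernel, stride=1, padding=0):
    # standard conv output-size formula: (size + 2*padding - kernel) // stride + 1
    return (size + 2 * padding - kernel) // stride + 1

s = 32              # assumed 32x32 input
s = conv_out(s, 5)  # conv1 (5x5 kernel): 32 -> 28
s = s // 2          # max_pool2d(2):      28 -> 14
s = conv_out(s, 5)  # conv2 (5x5 kernel): 14 -> 10
s = s // 2          # max_pool2d(2):      10 -> 5
print(s)            # 5, matching the 16 * 5 * 5 in fc1
```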

With pattern1, if I feed a certain random input to the model, the output is [[0. 0. 0.]].

But when I switch to pattern2 (the softmax branch) and feed in a random input, the output is, for instance, [[0.2243069 0.3878466 0.3878466]].

My question is: why does `x = F.softmax(self.fc2(x), dim=1)` give nonzero values even though `x = self.fc2(x)`, the previous step in the model, gives exactly [0, 0, 0]?
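For context on the behavior I'm asking about: softmax computes exp(x_i) / Σ exp(x_j), and exp(0) = 1, so an all-zero input maps to a uniform (nonzero) distribution rather than zeros. A minimal pure-Python sketch (my own helper, not part of the model above):

```python
import math

def softmax(xs):
    # subtract the max for numerical stability (a no-op for an all-zero input)
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

print(softmax([0.0, 0.0, 0.0]))
# [0.3333333333333333, 0.3333333333333333, 0.3333333333333333]
```

So zeros going into softmax come out as 1/3 each; softmax never returns zeros for finite inputs.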