zhoulukuan
(Zhoulukuan)
#1
I want to count how many elements of a tensor are nonzero:

```
output = model(input)
output = output.reshape(-1)
result1 = sum(output != 0)
result2 = torch.sum(output != 0)
```

However, I found that result1 and result2 are sometimes different, and result1 is zero even though there are many nonzero elements in the output. Why?
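One thing worth checking (an assumption on my part, since I don't know your PyTorch version): in older PyTorch releases `output != 0` returned a `uint8` tensor, and Python's built-in `sum` accumulates element by element in that dtype, so the count wraps around at 256. `torch.sum` promotes integer inputs to a 64-bit accumulator instead. A minimal sketch of the overflow, using an explicit `uint8` mask:

```python
import torch

# Simulate what `output != 0` returned in older PyTorch: a uint8 mask
mask = torch.ones(256, dtype=torch.uint8)

# Python's built-in sum accumulates in uint8, so 256 ones wrap around to 0
result1 = sum(mask)

# torch.sum promotes to int64, so the count comes out correct
result2 = torch.sum(mask)

print(result1, result2)
```

If the number of nonzero elements happens to be a multiple of 256, `result1` would be exactly zero, which matches the symptom you describe.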

maybe,

```
len(output.nonzero())
```

also, in what case did you get different result1 and result2?
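For completeness, a small sketch (my own, not from the thread) comparing the `nonzero()` approach with `torch.count_nonzero` (available in newer PyTorch versions) and a boolean-mask sum:

```python
import torch

t = torch.tensor([0.0, 1.5, 0.0, -2.0, 3.0])

n1 = len(t.nonzero())             # rows of nonzero indices
n2 = int(torch.count_nonzero(t))  # direct count (newer PyTorch)
n3 = int((t != 0).sum())          # boolean mask reduced with torch.sum

print(n1, n2, n3)  # 3 3 3
```

All three agree; the important part is reducing with a torch op rather than Python's built-in `sum`.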

zhoulukuan
(Zhoulukuan)
#3
Thanks for your advice!

I feel very confused because I cannot reproduce it reliably. This is a problem I encountered while reimplementing the Transformer.

it might have to do with dropout or with the random seed,

for example,

```
import torch
import torch.nn as nn

x = torch.randn(1, 1, 28, 28)
# Dropout is active in training mode, so each call drops a different random mask
c = nn.Sequential(nn.Conv2d(1, 10, 1), nn.Dropout())
c(x).sum(), c(x).sum()
```

would give different outputs,

whereas if the model is set to eval mode, that is,

```
c.eval()  # switches Dropout (and BatchNorm etc.) to inference behavior
```

then,

```
c(x).sum(), c(x).sum()
```

would give the same output

for the seed, if you use something like,

```
import random
import numpy as np
import torch

seed = 0
random.seed(seed)
np.random.seed(seed)
torch.manual_seed(seed)  # also seeds torch.random's default generator
torch.cuda.manual_seed(seed)
torch.backends.cudnn.deterministic = True
```

then the random ops should be reproducible
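As a quick sanity check (my own sketch), re-seeding before each run makes the generator replay the exact same sequence:

```python
import torch

torch.manual_seed(0)
a = torch.randn(3)

torch.manual_seed(0)  # re-seed so the generator replays the same draws
b = torch.randn(3)

print(torch.equal(a, b))  # True
```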


zhoulukuan
(Zhoulukuan)
#5
Thank you, I will have a try.