Strange result of Conv2d

I wrote a short script using Conv2d and got a strange result.

import torch
import torch.nn as nn
import numpy as np

torch.random.manual_seed(1)

a = [1 ** i for i in range(9)]  # nine ones
npx = np.array(a, dtype=np.float32).reshape(1, 1, 3, 3)
t = torch.FloatTensor(npx)

c = nn.Conv2d(1, 1, 1)
c.weight = nn.Parameter(((c.weight > 0) >= 0).float())  # set the 1x1 kernel weight to 1

Then the output of c(t) is not all ones, even though I expect every element to be 1.

The result changes with torch.random.manual_seed, and it happens in both PyTorch 0.4.0 and 1.0.

Did I misunderstand something?

Thanks!

You forgot the bias:

print(c.bias)
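
For example, a quick sanity check (just a sketch, reusing c and t from the script above) should show that every output element is exactly 1 plus the bias:

out = c(t)
print(c.bias)                           # the randomly initialized bias
print(torch.allclose(out, 1 + c.bias))  # True: the "strange" offset is just the bias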

You could disable the bias with

c = nn.Conv2d(1, 1, 1, bias=False)

which will produce your expected result.
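
Alternatively, if you want to keep the bias parameter, zeroing it in place should give the same all-ones output (a sketch, not the only way to do it):

with torch.no_grad():
    c.bias.zero_()  # keep the parameter, but remove its contribution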

Thank you. It works.