A neural network with a single ConvTranspose2d layer outputs different results in two evaluations on GPU

Hi experts,

Here I have a neural network with a single nn.ConvTranspose2d layer. It outputs different results across two repeated runs on the GPU, but identical results across two repeated runs on the CPU. What is the reason behind this? Is the difference usually safe to ignore? Thanks.

import torch
import torch.nn as nn

ngf = 1
nz = 1
device = torch.device('cuda')

class Generator_test_3(nn.Module):
    def __init__(self):
        super(Generator_test_3, self).__init__()
        self.main3 = nn.Sequential(
            nn.ConvTranspose2d(ngf, ngf, 4, 2, 1, bias=False),
        )

    def forward(self, input_noise):
        # avoid shadowing the built-in `input`
        return self.main3(input_noise)

net_test_3 = Generator_test_3().to(device)

torch.manual_seed(0)
noise = torch.randn(1, nz, 8, 8).to(device=device)
# count elements that differ between two forward passes on the same input
print(torch.sum((net_test_3(noise) - net_test_3(noise)) != 0))

I get exactly the same results in my setup, but note that you might be running into the expected small errors caused by limited floating-point precision. GPU kernels may reduce values in a different order between runs, and since floating-point addition is not associative, this can produce tiny discrepancies even for identical inputs and weights. You can check whether the mismatch stays in that range, as in the sketch below.
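A minimal check, reusing net_test_3 and noise from your snippet (the atol value here is just an illustrative threshold, not an official tolerance):

out1 = net_test_3(noise)
out2 = net_test_3(noise)
print((out1 - out2).abs().max())              # typically on the order of 1e-7 for float32
print(torch.allclose(out1, out2, atol=1e-6))  # True if the outputs only differ by tiny amounts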
Take a look at the Reproducibility docs to see how deterministic behavior can be achieved.
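For reference, the key settings from those docs look roughly like this (a sketch, not a complete recipe; torch.use_deterministic_algorithms requires PyTorch 1.8+, and the CUBLAS_WORKSPACE_CONFIG variable only matters on CUDA 10.2+):

import os
import torch

# must be set before any CUDA work for deterministic cuBLAS behavior
os.environ['CUBLAS_WORKSPACE_CONFIG'] = ':4096:8'

torch.manual_seed(0)                       # seed the CPU and CUDA RNGs
torch.use_deterministic_algorithms(True)   # raise an error on nondeterministic ops
torch.backends.cudnn.deterministic = True  # force deterministic cuDNN kernels
torch.backends.cudnn.benchmark = False     # disable autotuned kernel selection

Keep in mind that deterministic algorithms can be slower than the default ones.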
