Why does convolution's output change even with `deterministic=True`?

Background

From my understanding, torch.manual_seed sets the seed for the RNG, and torch.backends.cudnn.deterministic = True makes cuDNN's operations deterministic, i.e., fixed when given the same inputs.

Since there is no RNG factor in convolution (the cuDNN documentation on the convolution forward pass doesn't mention taking a seed as input or any use of an RNG), setting manual_seed should make no difference. However, when we test two settings, the results are not as expected:

  1. Setting 1:
    With only torch.backends.cudnn.deterministic = True, the output is NOT fixed; it deviates on every iteration.

  2. Setting 2:
    With both torch.backends.cudnn.deterministic = True and torch.manual_seed set, the output is fixed on every iteration. (Both settings are illustrated in the sketch below.)
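A minimal sketch of the two settings (the layer shapes and dummy input are assumptions for illustration; the point is only whether the module is constructed under a fixed seed):

```python
import torch
import torch.nn as nn

# cuDNN is only involved on a GPU; the RNG effect below is the same on CPU.
device = "cuda" if torch.cuda.is_available() else "cpu"
torch.backends.cudnn.deterministic = True

def run_once(seed=None):
    if seed is not None:
        torch.manual_seed(seed)  # setting 2: fix the RNG before construction
    conv = nn.Conv1d(4, 8, kernel_size=3).to(device)  # construction draws random numbers
    x = torch.ones(1, 4, 16, device=device)           # the input is always the same
    return conv(x)

print(torch.allclose(run_once(), run_once()))    # setting 1: False, runs differ
print(torch.allclose(run_once(0), run_once(0)))  # setting 2: True, runs match
```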

Experiment

Given the same input & weight (yes, we set the weight manually), and with torch.backends.cudnn.deterministic = True turned on, the output of

import torch.nn as nn

weight = ...  # some code that reads the weight file

conv = nn.Conv1d(...)
conv.weight.data = weight

changes every time it is called.

Example outputs:

  1. iteration 1
[-3.0552e+00, -5.3343e+00, -6.5944e-01,  1.1911e+00,  4.3999e+00,
           1.4698e+00, -3.8650e+00,  1.4742e+00,  1.2590e+00,  5.3744e+00,
          -1.1283e+01,  1.1128e+01, -1.3646e+01,  1.2124e+00, -9.6420e-01,
           7.5311e+00, -5.4766e+00,  2.8123e+00, -9.1796e+00,  6.2736e+00, ...
  2. iteration 2
[-2.9771e+00, -5.2562e+00, -5.8136e-01,  1.2692e+00,  4.4779e+00,
           1.5479e+00, -3.7869e+00,  1.5522e+00,  1.3371e+00,  5.4525e+00,
          -1.1205e+01,  1.1206e+01, -1.3568e+01,  1.2905e+00, -8.8612e-01,
           7.6092e+00, -5.3985e+00,  2.8903e+00, -9.1016e+00,  6.3517e+00, ...

You can see that the outputs are different.
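In case it helps, a self-contained sketch that reproduces the same observation without the weight file (the constant stand-in weight and the shapes are assumptions):

```python
import torch
import torch.nn as nn

torch.backends.cudnn.deterministic = True

weight = torch.full((8, 4, 3), 0.5)  # stand-in for the weight read from a file
x = torch.ones(1, 4, 16)             # fixed input

def iteration():
    conv = nn.Conv1d(4, 8, kernel_size=3)
    conv.weight.data = weight        # same weight on every iteration
    return conv(x)

out1, out2 = iteration(), iteration()
print(torch.allclose(out1, out2))    # False: outputs still differ
print((out2 - out1)[0, :3, :3])      # the difference is constant within each output channel
```

Note that the two example outputs above also differ by a constant offset (about 7.81e-02 on every element shown).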

However, when we set the random seed with torch.manual_seed(0), the output becomes identical on every iteration.

  1. Why does the output differ given the same inputs and weights, and with torch.backends.cudnn.deterministic = True?

  2. Why does torch.manual_seed(0) make the outputs identical?

When you use Modules from nn, you use

  • the functions,
  • the parameter-holding of the modules,
  • the default initialization of the parameters.

The latter uses random numbers. In your experiment only the weight is overwritten, so the randomly initialized bias still changes from run to run; fixing the seed fixes the bias as well, which is why the outputs then match.
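To make this concrete, a small sketch (shapes assumed) showing that construction consumes random numbers and that pinning the weight alone is not enough:

```python
import torch
import torch.nn as nn

# Each construction draws fresh random numbers for the default init
# of both the weight and the bias.
c1 = nn.Conv1d(4, 8, kernel_size=3)
c2 = nn.Conv1d(4, 8, kernel_size=3)
print(torch.equal(c1.bias, c2.bias))  # False: the bias differs per construction

# Fixing the bias as well (or seeding the RNG before construction)
# makes the forward pass reproducible.
weight = torch.full((8, 4, 3), 0.5)
for c in (c1, c2):
    c.weight.data = weight
    c.bias.data = torch.zeros(8)
x = torch.ones(1, 4, 16)
print(torch.allclose(c1(x), c2(x)))   # True
```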

Best regards

Thomas