From my understanding, `manual_seed` sets the seed for PyTorch's RNGs, and `torch.backends.cudnn.deterministic = True` makes cuDNN's output deterministic: the same input always produces the same output.
Since there is no RNG factor in convolution (the cuDNN documentation on the convolution forward pass doesn't mention taking a seed as input or using an RNG), setting `manual_seed` should make no difference. However, when we tested the two settings, the results were not as expected:
- With only `torch.backends.cudnn.deterministic = True`, the output is NOT fixed; it deviates on every iteration.
- With `torch.backends.cudnn.deterministic = True` and `manual_seed` set, the output is fixed on every iteration.
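To make the setup concrete, here is a minimal sketch of the experiment (the shapes, channel counts, and the stand-in weight are assumptions, not the original code):

```python
import torch
import torch.nn as nn

torch.backends.cudnn.deterministic = True
# Setting 2 would additionally call torch.manual_seed(0) here.

x = torch.randn(1, 4, 16)                        # (batch, channels, length) -- assumed shape
conv = nn.Conv1d(4, 8, kernel_size=3)            # assumed layer sizes
conv.weight.data = torch.ones_like(conv.weight)  # stand-in for the weight read from file
out = conv(x)
print(out.shape)  # torch.Size([1, 8, 14])
```

Running this script repeatedly corresponds to the "iterations" below: same input, same weight, deterministic cuDNN.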
Given the same input and weight (yes, we set the weight manually), and with `torch.backends.cudnn.deterministic = True` turned on, the output of

```python
weight = ...  # some code that reads the weight file
conv = nn.Conv1d(...)
conv.weight.data = weight
```

changes every time it is called:
- iteration 1
[-3.0552e+00, -5.3343e+00, -6.5944e-01, 1.1911e+00, 4.3999e+00, 1.4698e+00, -3.8650e+00, 1.4742e+00, 1.2590e+00, 5.3744e+00, -1.1283e+01, 1.1128e+01, -1.3646e+01, 1.2124e+00, -9.6420e-01, 7.5311e+00, -5.4766e+00, 2.8123e+00, -9.1796e+00, 6.2736e+00, ...
- iteration 2
-2.9771e+00, -5.2562e+00, -5.8136e-01, 1.2692e+00, 4.4779e+00, 1.5479e+00, -3.7869e+00, 1.5522e+00, 1.3371e+00, 5.4525e+00, -1.1205e+01, 1.1206e+01, -1.3568e+01, 1.2905e+00, -8.8612e-01, 7.6092e+00, -5.3985e+00, 2.8903e+00, -9.1016e+00, 6.3517e+00, ...
You can see that the outputs are different.
However, when we set the random seed with `torch.manual_seed(0)`, the output becomes identical on every iteration.
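What seeding does can be illustrated with Python's stdlib RNG (an analogy only, not PyTorch code): an unseeded RNG produces different values on each run, while a fixed seed makes every subsequent draw reproducible.

```python
import random

def draw_values(seed=None):
    """Return three 'initialization' values from a fresh RNG."""
    rng = random.Random(seed)
    return [rng.random() for _ in range(3)]

# Unseeded: two runs will (almost surely) differ.
a = draw_values()
b = draw_values()

# Seeded: two runs are identical, analogous to calling torch.manual_seed(0).
c = draw_values(seed=0)
d = draw_values(seed=0)
print(c == d)  # True
```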
So, two questions:

1. Why does the output differ given the same inputs and weights, even with `torch.backends.cudnn.deterministic = True`?
2. Why does `torch.manual_seed(0)` make the outputs identical?
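For reference, the behavior we observe can be reproduced at the parameter level: if `torch.manual_seed(0)` is called before constructing the layer, every randomly initialized parameter comes out identical across runs (layer sizes here are assumptions, not the original code):

```python
import torch
import torch.nn as nn

def build_conv():
    torch.manual_seed(0)  # reset RNG state before construction each time
    return nn.Conv1d(4, 8, kernel_size=3)  # assumed sizes

c1, c2 = build_conv(), build_conv()
# With the seed fixed, the randomly initialized parameters match exactly.
print(torch.equal(c1.weight, c2.weight))  # True
print(torch.equal(c1.bias, c2.bias))      # True
```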