Hi guys, I am having the exact same problem with the DETR model, and no matter what I try I can't seem to get reproducible results!
Hi @ptrblck, none of the solutions given in this post work for me. I have tried all of the following on PyTorch 1.4.0 and 1.3.1:
torch.backends.cudnn.deterministic = True  # force deterministic cuDNN kernels
torch.backends.cudnn.benchmark = False     # disable cuDNN autotuning
torch.manual_seed(0)                       # seed the CPU RNG
torch.cuda.manual_seed_all(0)              # seed the RNGs of all GPUs
np.random.seed(0)                          # seed NumPy's RNG
random.seed(0)                             # seed Python's built-in RNG
torch.cuda.manual_seed(0)                  # seed the current GPU (redundant with manual_seed_all)
torch.backends.cudnn.enabled = False       # disable cuDNN entirely
I also set the DataLoader's num_workers=0.
Do you have any suggestions?
I would recommend checking the reproducibility docs in addition to the posts here.
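As a rough sketch of what those docs boil down to (the seed_everything helper name is just for illustration, and torch.use_deterministic_algorithms needs PyTorch 1.8 or newer):

```python
import os
import random

import numpy as np
import torch

def seed_everything(seed: int = 0) -> None:
    # Seed every RNG that Python, NumPy, and PyTorch use.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)            # also seeds the CUDA RNGs in recent releases
    torch.cuda.manual_seed_all(seed)

    # Ask cuDNN for deterministic kernels and disable autotuning.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

    # Raise an error when a nondeterministic op is used (PyTorch 1.8+).
    # Some CUDA ops additionally need this env var set before the first CUDA call.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"
    torch.use_deterministic_algorithms(True)

seed_everything(0)
```

If you keep num_workers > 0, the docs also describe passing a seeded torch.Generator and a worker_init_fn to the DataLoader so the worker processes are seeded reproducibly as well.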
Hi,
I did the following:
torch.manual_seed(56)
random.seed(56)
np.random.seed(56)
and then initialized a linear layer with nn.Linear(3, 8).weight.
Re-running nn.Linear(3, 8).weight gives me different weight values.
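For reference, this is roughly a minimal version of what I ran:

```python
import random

import numpy as np
import torch
import torch.nn as nn

torch.manual_seed(56)
random.seed(56)
np.random.seed(56)

print(nn.Linear(3, 8).weight)  # prints one weight matrix
print(nn.Linear(3, 8).weight)  # prints a different weight matrix
```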
I think this is why you guys are having fluctuations in your results.
I am using PyTorch 1.8.1.
Any help would be appreciated.
Thanks.
Could you explain your use case a bit more?
If you are re-running nn.Linear(3, 8).weight, you create a new layer each time, and its parameters are drawn from the global RNG, which advances between calls, so different values are expected.
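A small illustration of what is happening (the seed value is arbitrary): each construction consumes the global RNG state, so consecutive layers differ, while re-seeding before a construction reproduces the earlier initialization.

```python
import torch
import torch.nn as nn

torch.manual_seed(56)
a = nn.Linear(3, 8).weight
b = nn.Linear(3, 8).weight   # RNG state has advanced, so b differs from a
print(torch.equal(a, b))     # False

torch.manual_seed(56)
c = nn.Linear(3, 8).weight   # same RNG state as when a was created
print(torch.equal(a, c))     # True
```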
You are right.
I just want a way to initialize a layer with the same weight matrices so that I get the same results across re-runs. That is my use case.
Thanks.
To get reproducible and deterministic results for the entire script, please take a look at the reproducibility docs, which were linked in my previous post.
Oh! I got it.
Thanks.