Random seed initialization

Sounds like there is another related question here.

Anyway, I think this can be a solution:

import random
import numpy as np
import torch

manualSeed = 1

random.seed(manualSeed)
np.random.seed(manualSeed)
torch.manual_seed(manualSeed)
# if you are using a GPU
torch.cuda.manual_seed(manualSeed)
torch.cuda.manual_seed_all(manualSeed)

torch.backends.cudnn.enabled = False
torch.backends.cudnn.benchmark = False
torch.backends.cudnn.deterministic = True
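
If it helps, here is a minimal sketch that just wraps the steps above in one helper you can call at the top of a script (the set_seed name and the torch.cuda.is_available() guard are my own choices, not from the snippet above):

import random
import numpy as np
import torch

def set_seed(seed):
    # seed Python, NumPy and PyTorch (CPU + GPU) in one place
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    if torch.cuda.is_available():
        torch.cuda.manual_seed(seed)
        torch.cuda.manual_seed_all(seed)
    # trade speed for reproducibility on the cuDNN side
    torch.backends.cudnn.enabled = False
    torch.backends.cudnn.benchmark = False
    torch.backends.cudnn.deterministic = True

set_seed(1)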

I also set num_workers = 0 in the DataLoader.

Based on the discussion there, you also need to set worker_init_fn:

from torch.utils import data

def _init_fn(worker_id):
    # the DataLoader passes each worker's id to worker_init_fn
    np.random.seed(manualSeed)


data_loader = data.DataLoader(..., batch_size=...,
                              collate_fn=...,
                              num_workers=...,
                              shuffle=...,
                              pin_memory=...,
                              worker_init_fn=_init_fn)
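
To make the call concrete, here is a toy usage sketch (the TensorDataset and the specific argument values are just placeholders for illustration; manualSeed and _init_fn are repeated so the snippet runs on its own):

import numpy as np
import torch
from torch.utils import data

manualSeed = 1

def _init_fn(worker_id):
    np.random.seed(manualSeed)

# toy dataset just to show the call; replace with your own dataset
dataset = data.TensorDataset(torch.randn(100, 3), torch.randint(0, 2, (100,)))

data_loader = data.DataLoader(dataset,
                              batch_size=8,
                              num_workers=0,
                              shuffle=True,
                              pin_memory=True,
                              worker_init_fn=_init_fn)

for inputs, targets in data_loader:
    pass  # training step would go here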


I noticed that if we don't set torch.backends.cudnn.enabled = False, the results are very close but sometimes don't match :hushed:
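
If you want to check this yourself, here is a rough sketch that runs the same seeded forward/backward pass twice and compares the results (the layer, shapes, and run_once name are my own, and it assumes a CUDA device is available):

import torch
import torch.nn as nn

def run_once(seed):
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)
    layer = nn.Conv2d(3, 8, kernel_size=3).cuda()
    x = torch.randn(4, 3, 32, 32, device="cuda")
    out = layer(x)
    out.sum().backward()
    return out.detach().cpu(), layer.weight.grad.detach().cpu()

out1, grad1 = run_once(1)
out2, grad2 = run_once(1)
# with the cuDNN settings above, both comparisons should print True
print(torch.equal(out1, out2), torch.equal(grad1, grad2))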
P.S. I'm using PyTorch 1.0.1.
