How to get deterministic behavior?


(David Lopes de Macêdo) #1

I am using:

cudnn.benchmark = False
cudnn.deterministic = True

random.seed(1)
numpy.random.seed(1)
torch.manual_seed(1)
torch.cuda.manual_seed(1)

And I am still not getting deterministic behavior…


#2

How large are the differences?
Could you provide a code snippet showing the error?

If the absolute error is approximately 1e-6, it might just be float32 rounding.
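For example, here is a quick sketch of the kind of float32 rounding difference meant here; summing the same values in a different order may change the result by roughly 1e-6:

import torch

x = torch.randn(10000)
# Summing in reverse order changes how intermediate results are rounded,
# so the two sums may differ slightly even though the data is identical.
print((x.sum() - x.flip(0).sum()).item())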


(David Lopes de Macêdo) #4

The differences are meaningful, not on the order of 1e-6…

They are about 1%, more or less…


(David Lopes de Macêdo) #5

The problem is in the data augmentation transformations:

transforms.RandomCrop(32, padding=4),
transforms.RandomHorizontalFlip(),

After I removed those two transforms, everything became 100% deterministic…

I am using PyTorch 0.3.1

Does torchvision have a different seed from torch?
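For context, a minimal sketch of the seeding question, assuming torchvision ~0.2 (the version contemporary with PyTorch 0.3/0.4), whose random transforms draw from Python's built-in random module rather than from torch's RNG:

import random

import torch
from torchvision import transforms

aug = transforms.Compose([
    transforms.RandomCrop(32, padding=4),
    transforms.RandomHorizontalFlip(),
])
to_tensor = transforms.ToTensor()

img = transforms.ToPILImage()(torch.rand(3, 32, 32))

# Resetting the RNG state before each call reproduces the same augmentation
# (in torchvision of that era, random.seed is the one that matters here).
random.seed(1)
torch.manual_seed(1)
a = to_tensor(aug(img))
random.seed(1)
torch.manual_seed(1)
b = to_tensor(aug(img))
print(torch.equal(a, b))  # True in the main process; workers are separate processes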


(David Lopes de Macêdo) #6

The following appears to be the same issue:

If I use num_workers=0, I can get back the augmentation without losing the deterministic behavior, exactly as reported in the link above.
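A minimal sketch of that workaround; with num_workers=0 all augmentation runs in the main process, so the seeds set there apply (the dataset here is a placeholder):

import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(100, 3, 32, 32))  # placeholder dataset

# num_workers=0: no worker processes are spawned, so every random draw
# (shuffling, augmentation) comes from the RNGs seeded in this process.
loader = DataLoader(dataset, batch_size=32, shuffle=True, num_workers=0)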


(Simon Wang) #7

This has been fixed in 0.4, provided you set random.seed in worker_init_fn.

Furthermore, you might want to set torch.backends.cudnn.deterministic=True
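A sketch of that fix; the dataset is a placeholder and the seed_worker name is illustrative:

import random

import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(100, 3, 32, 32))  # placeholder dataset

def seed_worker(worker_id):
    # In 0.4+, torch.initial_seed() inside a worker returns base_seed + worker_id,
    # so deriving the Python random seed from it keeps runs reproducible while
    # still giving each worker a distinct stream.
    random.seed(torch.initial_seed() % 2**32)

loader = DataLoader(dataset, batch_size=32, shuffle=True,
                    num_workers=4, worker_init_fn=seed_worker)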


(David Lopes de Macêdo) #9

It worked. Thanks. Nevertheless, I think we should have a more straightforward way to get deterministic behavior. This workaround in the workers is not very intuitive for beginners.


(David Lopes de Macêdo) #10

Based on my tests, even in PyTorch 0.4 we still need to initialize the workers with the same seed to get deterministic behavior. The following lines are NOT enough:

cudnn.benchmark = False
cudnn.deterministic = True

random.seed(1)
numpy.random.seed(1)
torch.manual_seed(1)
torch.cuda.manual_seed(1)

I think this should not be the standard behavior. In my opinion, the above lines should be enough to provide deterministic behavior. It is not obvious to a novice that, besides the lines above, they also need to initialize the workers with the same seed to get deterministic behavior.
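For reference, a sketch of the full recipe this thread converged on, combining the lines above with the worker reseeding from post #7 (dataset and worker count are placeholders):

import random

import numpy
import torch
from torch.utils.data import DataLoader, TensorDataset

torch.backends.cudnn.benchmark = False
torch.backends.cudnn.deterministic = True

random.seed(1)
numpy.random.seed(1)
torch.manual_seed(1)
torch.cuda.manual_seed(1)

dataset = TensorDataset(torch.randn(100, 3, 32, 32))  # placeholder dataset

def seed_worker(worker_id):
    # Reseed the RNGs the augmentation code may use inside each worker;
    # NumPy seeds must fit in 32 bits.
    worker_seed = torch.initial_seed() % 2**32
    random.seed(worker_seed)
    numpy.random.seed(worker_seed)

loader = DataLoader(dataset, batch_size=32, shuffle=True,
                    num_workers=4, worker_init_fn=seed_worker)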


(Simon Wang) #11

The problem is numpy. We can't assume that numpy is installed, so you have to set the numpy seed in the workers yourself.


(Nicholas Dronen) #12

IMO, worker_init_fn allows some flexibility, but why shouldn't PyTorch's workers have a reasonable default behavior? Something like the following code block, if it were to execute before worker_init_fn, would be backward compatible and would provide determinism out of the box, whether NumPy is installed or not.

try:
    import numpy
    torch_seed = torch.initial_seed()
    # NumPy expects an unsigned 32-bit integer seed, so reduce the
    # 64-bit torch seed into the valid range [0, 2**32 - 1].
    np_seed = torch_seed % 2**32
    numpy.random.seed(np_seed)
except ImportError:
    pass