How to get deterministic behavior?


(David Lopes de Macêdo) #1

I am using:

import random

import numpy
import torch
import torch.backends.cudnn as cudnn

cudnn.benchmark = False
cudnn.deterministic = True

random.seed(1)
numpy.random.seed(1)
torch.manual_seed(1)
torch.cuda.manual_seed(1)

And still not getting deterministic behavior…


#2

How large are the differences?
Could you provide a code snippet showing the error?

If the absolute error is approx. 1e-6, it might be due to the limited precision of float32 arithmetic.
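As an illustration (not from the thread): float32 addition is not associative, so merely changing the order in which the same values are summed can already shift a result at this scale, with no randomness involved.

```python
import numpy as np

# float32 addition is not associative: summing identical values in a
# different order can produce slightly different results.
rng = np.random.RandomState(0)
x = rng.rand(100_000).astype(np.float32)

s_forward = np.add.reduce(x)        # one summation order
s_reverse = np.add.reduce(x[::-1])  # the opposite order

# Any difference here comes purely from rounding, not from RNG state.
print(abs(float(s_forward) - float(s_reverse)))
```

Differences of this kind are expected and harmless; they are far smaller than the ~1% gap reported below.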


(David Lopes de Macêdo) #4

The differences are meaningful, not of the order of 1e-6.

They are about 1%, more or less.


(David Lopes de Macêdo) #5

The problem is in the data augmentation transformations:

transforms.RandomCrop(32, padding=4),
transforms.RandomHorizontalFlip(),

After I removed the above code, it worked 100% deterministically…

I am using PyTorch 0.3.1.

Does torchvision have a different seed from torch?


(David Lopes de Macêdo) #6

The following appears to be the same issue:

If I use num_workers=0, I can get back the augmentation without losing the deterministic behavior, exactly as reported in the link above.
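For reference, a minimal sketch of this workaround (the dataset argument is a hypothetical stand-in for your augmented dataset): with num_workers=0 the random transforms execute in the main process, so the seeds set there control them directly.

```python
import random

import numpy
import torch
from torch.utils.data import DataLoader


def make_loader(train_dataset):
    # Seed the main-process RNGs. With num_workers=0, the random
    # transforms also run in this process, so these seeds govern them.
    random.seed(1)
    numpy.random.seed(1)
    torch.manual_seed(1)
    return DataLoader(train_dataset, batch_size=128, shuffle=True,
                      num_workers=0)
```

Calling `make_loader` before each epoch and iterating it should yield identical shuffling and augmentation across runs, at the cost of single-process data loading.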


(Simon Wang) #7

This has been fixed in 0.4, provided you set random.seed in worker_init_fn.

Furthermore, you might want to set torch.backends.cudnn.deterministic=True
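A sketch of that worker_init_fn approach (the toy dataset below stands in for an augmented torchvision dataset; its random jitter mimics RandomCrop/RandomHorizontalFlip):

```python
import random

import numpy
from torch.utils.data import DataLoader, Dataset


class NoisyDataset(Dataset):
    """Toy dataset whose __getitem__ draws from numpy's RNG,
    mimicking torchvision's random transforms."""

    def __len__(self):
        return 8

    def __getitem__(self, idx):
        return idx + numpy.random.rand()  # random jitter per sample


def seed_worker(worker_id):
    # Re-seed the python and numpy RNGs inside each worker process,
    # so the random augmentations are reproducible across runs.
    random.seed(1 + worker_id)
    numpy.random.seed(1 + worker_id)


def run_epoch():
    loader = DataLoader(NoisyDataset(), batch_size=2, num_workers=2,
                        worker_init_fn=seed_worker)
    return [batch.tolist() for batch in loader]
```

Without `worker_init_fn`, each epoch's worker processes can start from unseeded (or identically inherited) RNG state, which is what breaks reproducibility; with it, two calls to `run_epoch()` should produce identical batches.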


(David Lopes de Macêdo) #9

It worked, thanks. Nevertheless, I think we should have a more straightforward way to get deterministic behavior. This “workaround” in the workers is not very intuitive for beginners.