Why is the loss different even when I use torch.manual_seed(10)?

In order to get the same result during training, I call torch.manual_seed(10) at the beginning of the main code, but it doesn't seem to work as I expected: when I run the same training code again, the loss curve is different.

Check this thread and the PyTorch note on the subject (https://pytorch.org/docs/stable/notes/randomness.html).

There are more modules whose seeds you need to fix: torch.manual_seed only covers PyTorch's own RNG, so you have to seed every module that may introduce randomness (Python's random, NumPy, CUDA), as in the sketch below.
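A minimal sketch that seeds the usual suspects in one place (seed_everything is just an illustrative helper name, not a PyTorch API):

```python
import random

import numpy as np
import torch

def seed_everything(seed: int = 10) -> None:
    """Seed every common source of randomness in one place."""
    random.seed(seed)                 # Python's built-in RNG
    np.random.seed(seed)              # NumPy RNG (used by many transforms)
    torch.manual_seed(seed)           # PyTorch CPU RNG
    torch.cuda.manual_seed_all(seed)  # PyTorch RNGs on every GPU
    # cuDNN picks convolution algorithms non-deterministically by default;
    # pinning them trades some speed for run-to-run determinism.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

seed_everything(10)
```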
For the DataLoader, I recommend starting with num_workers=0 or 1 (with 0, loading runs in the main process). Once you obtain reproducible results, you can move to num_workers > 1; see the second sketch below.
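When you do move to num_workers > 1, something like this keeps the workers deterministic as well; it follows the seed_worker pattern from the PyTorch reproducibility note, shown here with a toy dataset so the sketch is runnable:

```python
import random

import numpy as np
import torch
from torch.utils.data import DataLoader, TensorDataset

def seed_worker(worker_id: int) -> None:
    # Each worker derives its seed from the base seed set in the main
    # process, so NumPy/random stay deterministic inside workers too.
    worker_seed = torch.initial_seed() % 2**32
    np.random.seed(worker_seed)
    random.seed(worker_seed)

# A toy dataset just to make the sketch self-contained.
dataset = TensorDataset(torch.randn(100, 3))

g = torch.Generator()
g.manual_seed(10)

loader = DataLoader(
    dataset,
    batch_size=32,
    shuffle=True,
    num_workers=2,
    worker_init_fn=seed_worker,  # seeds each worker deterministically
    generator=g,                 # fixes the shuffling order across runs
)
```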
