Keep seed constant for dataloader/transforms but change seed for weight initialization

Hi, I’m training a CNN on GPU with a custom dataset + transforms. I need the runs to be fully reproducible, so I currently set the random seed with torch.manual_seed(seed). I want to train and evaluate the CNN 10 times with different weight initializations. As far as I understand, I can control these weight initializations via the random seed. However, I want to keep the seed for the dataloader/transforms constant across all 10 runs.
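To illustrate what I mean by keeping the data seed constant: one idea I had is to give the DataLoader its own torch.Generator, so the shuffling order no longer depends on the global seed at all. A simplified sketch (the dataset, batch size, and seed values are made up; a TensorDataset stands in for my custom dataset):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

DATA_SEED = 0  # hypothetical: the fixed seed for data shuffling

# Stand-in for my custom dataset
dataset = TensorDataset(torch.arange(8).float().unsqueeze(1))

# A dedicated Generator decouples shuffling from the global RNG,
# so reseeding with torch.manual_seed for the model does not
# change the batch order.
g = torch.Generator()
g.manual_seed(DATA_SEED)
loader = DataLoader(dataset, batch_size=2, shuffle=True, generator=g)

order = [batch[0].flatten().tolist() for batch in loader]
```

Is this the right way to decouple the two, or is there a more standard approach?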


  • Is it possible to change the seed temporarily, only for model initialization, and then restore the “default” seed I use for the transforms?
  • Where exactly do I need to set the seed? My code is spread over multiple files/functions: a class for the CNN, a class for the data, and a function that builds the dataloaders. Training is split across three functions (one epoch, a general training loop, and a specific train + eval function for the model I’m currently training), plus the top-level script that runs the training and evaluation.
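For the first question, one approach I came across (not sure if it’s the intended one) is torch.random.fork_rng: it saves the global RNG state on entry and restores it on exit, so I could reseed inside the context just for the weight initialization. A minimal sketch, with an nn.Linear standing in for my CNN and made-up seed values:

```python
import torch
import torch.nn as nn

DATA_SEED = 0                # hypothetical fixed seed for dataloader/transforms
torch.manual_seed(DATA_SEED)

def make_model(init_seed):
    # fork_rng snapshots the current global RNG state; we reseed inside
    # the context for the weight init only, and the previous state is
    # restored on exit, leaving the data/transform RNG stream untouched.
    with torch.random.fork_rng():
        torch.manual_seed(init_seed)
        model = nn.Linear(4, 2)  # stand-in for the CNN
    return model

# Different init seeds should give different weights,
# while the global RNG stream continues as if nothing happened.
model_a = make_model(100)
model_b = make_model(101)
```

Would this work on GPU as well (fork_rng also forks the CUDA RNG states by default, as far as I can tell), or do I need torch.get_rng_state / torch.set_rng_state manually?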

Versions: torch 1.13, torchvision 0.14.0, Python 3.9.13