Setting seed in torch DDP

Hi all,

I’m confused about how to set the seed when training with torch DDP. I understand that the models must be initialized to the same parameter values across processes, so I assumed the seed has to be the same on every process.

Looking at the code in the DeiT repo, I see the line `seed = args.seed + utils.get_rank()`.
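For context, the seeding in DeiT's `main.py` looks roughly like this (a paraphrased sketch; `utils.get_rank()` is DeiT's wrapper around `torch.distributed.get_rank()`, and `args.seed` comes from the argument parser):

```python
import numpy as np
import torch

# Per-rank seed: rank 0 gets args.seed, rank 1 gets args.seed + 1, etc.
# This decorrelates randomness (e.g. data augmentation order) across processes.
seed = args.seed + utils.get_rank()
torch.manual_seed(seed)
np.random.seed(seed)
```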

Doesn’t this mean we have a different seed for each process? So wouldn’t the models in each process be initialized differently?

Yes, with that line each rank uses a different seed, so the models on each rank would be initialized with different values.

However, at construction time DDP broadcasts the model parameters from rank 0 to all other ranks, so every rank starts training with identical parameters. Setting the same seed is therefore not needed for that purpose, unless you also want determinism in the initial parameters across different training runs.
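You can verify this yourself with a minimal sketch like the following (my own example, not from the DeiT repo; assumes launch via `torchrun --nproc_per_node=2 check_broadcast.py`, which sets the RANK/WORLD_SIZE/MASTER_ADDR/MASTER_PORT environment variables). It deliberately seeds each rank differently and then checks that the parameters match after wrapping the model in DDP:

```python
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    dist.init_process_group(backend="gloo")
    rank = dist.get_rank()

    # Deliberately seed each rank differently, so the raw models diverge.
    torch.manual_seed(42 + rank)
    model = nn.Linear(4, 4)

    # DDP broadcasts rank 0's parameters to all ranks at construction time.
    ddp_model = DDP(model)

    # Compare a parameter checksum across ranks: they should now be identical.
    checksum = sum(p.sum() for p in ddp_model.parameters()).detach()
    gathered = [torch.zeros_like(checksum) for _ in range(dist.get_world_size())]
    dist.all_gather(gathered, checksum)
    if rank == 0:
        print("identical across ranks:",
              all(torch.allclose(g, gathered[0]) for g in gathered))

    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```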
