Setting seed in torch DDP

Yes, in that case the model on each rank would be initialized with different parameter values.

However, at startup DDP broadcasts the model parameters from rank 0 to all other ranks, so every rank begins training with identical parameters. Setting a seed is therefore not needed for consistency across ranks; it only matters if you want deterministic initialization across different training runs.
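
For illustration, here is a minimal sketch of that behavior. It assumes a `torchrun` launch (which sets `RANK`/`LOCAL_RANK`/`WORLD_SIZE`) with the NCCL backend and one GPU per process; the toy `Linear` model and the seed value `42` are arbitrary placeholders:

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # torchrun sets the env vars that init_process_group reads.
    dist.init_process_group("nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # Optional: seed only if you want the *same* initial weights across
    # separate training runs. It is not required for cross-rank consistency.
    # torch.manual_seed(42)

    # Each rank constructs its own model; without a seed, the initial
    # weights differ from rank to rank at this point.
    model = torch.nn.Linear(10, 10).cuda(local_rank)

    # Wrapping the model in DDP broadcasts rank 0's parameters (and
    # buffers) to all ranks, so every rank starts from identical weights
    # regardless of whether a seed was set.
    ddp_model = DDP(model, device_ids=[local_rank])

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

You can convince yourself of the broadcast by printing a parameter checksum (e.g. `sum(p.sum() for p in ddp_model.parameters())`) on each rank before and after the DDP wrap: the values differ before and match after.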
