Multiprocessing / DDP - Barrier Blocks loss.backward()

Is this the same issue as the topic “Multiprocessing - Barrier Blocks all Processes”?
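
For context, here is a minimal sketch of the kind of pattern the title describes (a deadlock when a barrier is reached on one rank while another rank is inside `loss.backward()`). The setup details (two CPU processes, gloo backend, the toy model, and the rank-conditional barrier) are assumptions for illustration, not taken from the original report:

```python
import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP

def worker(rank, world_size):
    # Assumed single-machine setup for the sketch.
    os.environ["MASTER_ADDR"] = "127.0.0.1"
    os.environ["MASTER_PORT"] = "29500"
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = DDP(torch.nn.Linear(10, 1))
    loss = model(torch.randn(4, 10)).sum()

    if rank == 0:
        dist.barrier()   # rank 0 blocks here waiting for rank 1...
    loss.backward()      # ...while rank 1 issues DDP's gradient all-reduce,
                         # so the collectives never match and both ranks hang.

    dist.destroy_process_group()

if __name__ == "__main__":
    mp.spawn(worker, args=(2,), nprocs=2)
```

DDP's `backward()` itself performs collective communication (the gradient all-reduce), so any `dist.barrier()` that not all ranks reach at the same point can block it.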

By the way, for future questions on torch.distributed, could you please add the “distributed” tag? People working on the distributed package monitor that channel.