Does SyncBatchNorm work normally without DDP?

I have a network with SyncBatchNorm for multi-GPU training. If I train the network on a single GPU in a normal non-parallel way (i.e. without using DDP), do I need to change SyncBatchNorm to BatchNorm? Does SyncBatchNorm still work normally in this case? Thanks for your help!

It should fall back to the vanilla batch norm operation if no distributed setup is detected: the module's forward pass checks whether `torch.distributed` is available and a process group is initialized, and otherwise dispatches to the regular `F.batch_norm` call, as seen in this line of code.
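A quick way to verify the fallback yourself is to compare a `SyncBatchNorm` layer against a plain `BatchNorm2d` with identical parameters, with no process group initialized. This is a minimal sketch; it assumes a reasonably recent PyTorch build, since older versions raised an error when `SyncBatchNorm` received a non-CUDA input even without distributed training.

```python
import torch
import torch.nn as nn

# No torch.distributed.init_process_group() call anywhere: SyncBatchNorm
# should detect that and fall back to the ordinary batch norm kernel.
sync_bn = nn.SyncBatchNorm(3)
plain_bn = nn.BatchNorm2d(3)
plain_bn.load_state_dict(sync_bn.state_dict())  # same weights and running stats

x = torch.randn(4, 3, 8, 8)

# Eval mode gives a deterministic comparison (training mode also works,
# it just additionally updates the running statistics).
sync_bn.eval()
plain_bn.eval()
out_sync = sync_bn(x)
out_plain = plain_bn(x)
print(torch.allclose(out_sync, out_plain))  # prints True
```

So for single-GPU (or CPU) runs you can keep the `SyncBatchNorm` layers as they are. For the opposite direction, converting an existing model's `BatchNorm` layers for multi-GPU training, PyTorch provides `nn.SyncBatchNorm.convert_sync_batchnorm(model)`.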
