Does DDP support multiple forward/backward passes?

I want to train a model with multiple branches.
When the DDP model is trained with multiple backward calls (using retain_graph=True), it still fails to update the weights correctly (the gradients explode).

For example, I wrote the code below:
scaler.scale(loss1).backward(retain_graph=True)
scaler.scale(loss2).backward()
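
For reference, here is a minimal self-contained sketch of what I am doing (the `TwoBranchNet` module, the tensor shapes, and the single-process gloo setup are placeholders just so it runs anywhere; in my real code this runs on GPUs with AMP enabled):

```python
import os
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP


class TwoBranchNet(nn.Module):
    # Placeholder model: a shared trunk feeding two branches.
    def __init__(self):
        super().__init__()
        self.trunk = nn.Linear(10, 16)
        self.branch1 = nn.Linear(16, 1)
        self.branch2 = nn.Linear(16, 1)

    def forward(self, x):
        h = torch.relu(self.trunk(x))
        return self.branch1(h), self.branch2(h)


def main():
    # Single-process process group so DDP can be constructed for this demo.
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = DDP(TwoBranchNet())
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)
    # enabled=False here only so the sketch runs on CPU; I use enabled=True on GPU.
    scaler = torch.cuda.amp.GradScaler(enabled=False)

    x = torch.randn(8, 10)
    target1 = torch.randn(8, 1)
    target2 = torch.randn(8, 1)

    optimizer.zero_grad()
    out1, out2 = model(x)
    loss1 = nn.functional.mse_loss(out1, target1)
    loss2 = nn.functional.mse_loss(out2, target2)

    # This is the part I am unsure about: two backward calls on one forward.
    scaler.scale(loss1).backward(retain_graph=True)
    scaler.scale(loss2).backward()

    scaler.step(optimizer)
    scaler.update()
    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```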

If DDP supports multiple forward/backward passes, could you suggest the correct usage?
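
If this two-backward pattern is not supported, is the recommended alternative simply to sum the losses and call backward once, like below?
scaler.scale(loss1 + loss2).backward()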