Does torch elastic (torchrun) support model parallelism?

In the docs (torchrun (Elastic Launch) — PyTorch 2.7 documentation) I only see it used with DDP. Does anyone know whether it also supports model parallelism? Thanks.
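
For concreteness, this is the kind of setup I have in mind: a minimal sketch (not taken from the docs) that assumes torchrun still sets the usual RANK/LOCAL_RANK/MASTER_ADDR/MASTER_PORT environment variables, and that each launched worker owns two GPUs so the model can be split across them instead of replicated with DDP.

```python
import os

import torch
import torch.distributed as dist
import torch.nn as nn


def main():
    # torchrun sets these environment variables for each worker it launches
    rank = int(os.environ["RANK"])
    local_rank = int(os.environ["LOCAL_RANK"])

    # init_process_group picks up MASTER_ADDR/MASTER_PORT from the environment
    dist.init_process_group(backend="nccl")

    # Hypothetical split: each worker uses two GPUs, first half of the model
    # on one device and the second half on the other (simple model parallelism)
    dev0 = torch.device(f"cuda:{2 * local_rank}")
    dev1 = torch.device(f"cuda:{2 * local_rank + 1}")

    part1 = nn.Linear(1024, 4096).to(dev0)
    part2 = nn.Linear(4096, 10).to(dev1)

    x = torch.randn(32, 1024, device=dev0)
    # The forward pass crosses devices: part1's output is moved to dev1 for part2
    out = part2(part1(x).to(dev1))
    print(f"rank {rank}: output shape {tuple(out.shape)}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

On a 4-GPU node I would launch it with something like `torchrun --nproc_per_node=2 example.py`, so each of the two workers gets two GPUs. My question is whether torchrun's elastic launch is fine with this kind of script, or whether it is only intended for DDP-style one-process-per-GPU replication.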

@fegin I'm curious: is model parallelism supported in torchrun?