How can I wrap multiple transformer blocks in T5?

Answered on GitHub: Wrapping multiple layers in PyTorch FSDP (pytorch/pytorch, issue #116986)
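The usual approach (as discussed in that issue) is FSDP's `transformer_auto_wrap_policy`, which wraps every module of a given class as its own FSDP unit. For T5 from Hugging Face `transformers`, the per-layer class is `T5Block`, so passing it as `transformer_layer_cls` wraps each encoder and decoder block. A minimal sketch, assuming `transformers` is installed; the tiny `T5Config` sizes are arbitrary, chosen only to avoid downloading a checkpoint:

```python
import functools

import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP
from torch.distributed.fsdp.wrap import transformer_auto_wrap_policy
from transformers import T5Config, T5ForConditionalGeneration
from transformers.models.t5.modeling_t5 import T5Block

# Build a small T5 from a config so no checkpoint download is needed
# (hypothetical sizes, just for illustration).
config = T5Config(d_model=64, d_ff=128, num_layers=2, num_heads=4, d_kv=16)
model = T5ForConditionalGeneration(config)

# transformer_auto_wrap_policy tells FSDP to wrap every module whose
# class appears in transformer_layer_cls -- here, each T5Block (one per
# encoder/decoder layer) becomes its own FSDP unit.
auto_wrap_policy = functools.partial(
    transformer_auto_wrap_policy,
    transformer_layer_cls={T5Block},
)

# Actually sharding requires an initialized process group (e.g. launched
# via torchrun); guarded here so the sketch also runs standalone.
if dist.is_initialized():
    model = FSDP(model, auto_wrap_policy=auto_wrap_policy)
```

To wrap several block classes together (e.g. a custom layer type alongside `T5Block`), add them all to the `transformer_layer_cls` set.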