Is there a sequence data loader in PyTorch?

Back in the day, Torch had a nice collection of data loaders. For sequence modeling, to get a [25 x 32 x 100] tensor for RNN training (where 25 is the sequence length, 32 is the batch size, and 100 is the embedding size), I used SequenceLoader, which accepted (sequence, batchsize, bidirectional) parameters. Quickly skimming through the current torch.utils.data docs I found nothing for sequences; maybe I'm just not seeing it, but how do you do sequence batching in PyTorch now?
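
The closest thing I could come up with myself is a custom collate_fn plus torch.nn.utils.rnn.pad_sequence. Here is a minimal sketch of what I mean (the dataset, sizes, and the collate helper are all made up for illustration). Is this the intended replacement for SequenceLoader, or is there something built in that I am missing?

```python
import torch
from torch.utils.data import Dataset, DataLoader
from torch.nn.utils.rnn import pad_sequence


class SequenceDataset(Dataset):
    """Toy dataset of variable-length sequences of embedding vectors."""

    def __init__(self, num_samples=1000, max_len=25, emb_size=100):
        self.data = [
            torch.randn(torch.randint(5, max_len + 1, (1,)).item(), emb_size)
            for _ in range(num_samples)
        ]

    def __len__(self):
        return len(self.data)

    def __getitem__(self, idx):
        return self.data[idx]


def collate(batch):
    # pad_sequence pads every sequence in the batch to the longest one;
    # with batch_first=False the result is [seq_len x batch x emb]
    lengths = torch.tensor([seq.size(0) for seq in batch])
    padded = pad_sequence(batch, batch_first=False)
    return padded, lengths


loader = DataLoader(SequenceDataset(), batch_size=32, collate_fn=collate)

for padded, lengths in loader:
    print(padded.shape)  # e.g. torch.Size([25, 32, 100]) when the longest sequence is 25
    break
```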