Sampler to sample contiguous data?

I'm wondering if it's possible to use any of the existing Sampler classes or a custom class to do the following: I want the DataLoader to shuffle the data, but in contiguous units of size batch_size. For example, if my batch size is 10, I'd have the DataLoader load

-> data[10:20] -> data[40:50] -> data[0:10] -> …
or
-> data[50:60], data[10:20], data[0:10] -> … etc.

If I try to use the DataLoader's shuffle option as it is, the output is not contiguous; it will just be 10 random indices. Is there any way to enforce this?

Do you want overlapping windows or unique ones?
If overlapping is OK, you could just use the shuffled indices and slice your data.
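For the overlapping case, a minimal sketch (the class name and constructor are mine for illustration, not part of any PyTorch API) could map every start index to a window:

import torch
from torch.utils.data import Dataset

class OverlappingDataset(Dataset):
    def __init__(self, data, window=10):
        self.data = data
        self.window = window

    def __getitem__(self, index):
        # consecutive indices yield overlapping windows
        return self.data[index:index + self.window]

    def __len__(self):
        # last valid start index still leaves room for a full window
        return len(self.data) - self.window + 1

Shuffling the indices (e.g. via shuffle=True in the DataLoader) then gives you overlapping contiguous slices in random order.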
For unique (non-overlapping) windows, you could use something like this:

import torch
from torch.utils.data import Dataset

class MyDataset(Dataset):
    def __init__(self, window=10):
        # 100 rows; row i holds the value i in all 10 columns,
        # which makes the contiguous chunks easy to inspect
        self.data = torch.arange(100).view(-1, 1).expand(-1, 10)
        self.window = window

    def __getitem__(self, index):
        # map the sample index to the start of its non-overlapping window
        index = index * self.window
        x = self.data[index:index + self.window]
        return x

    def __len__(self):
        # number of whole windows; use integer division so len() returns an int
        return len(self.data) // self.window

dataset = MyDataset(window=10)
print(dataset[0])
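
Since each sample is already one contiguous window, wrapping the dataset in a DataLoader with shuffle=True shuffles the order of the windows, not the rows inside them. A quick check (batch_size=1 here just to keep one window per iteration):

from torch.utils.data import DataLoader

loader = DataLoader(dataset, batch_size=1, shuffle=True)
for x in loader:
    x = x.squeeze(0)  # drop the batch dim -> [window, 10]
    print(x[:, 0])    # first column = the contiguous row indices of this chunk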