Error With Sequential Sampler

When the DataLoader iterates through the loader below, I get an error saying

index 307 is out of bounds for axis 0 with size 307

but with RandomSampler I don't get any error. The size of the validation set array is 307.

import torch
from torch.utils.data import SequentialSampler, RandomSampler
from tqdm import tqdm

for tr_idx, val_idx in tqdm(kf.split(z)):

    train_dataset = DatasetRetriever(
        train_arrays=z[tr_idx],  # .index.values,
        targets=y[tr_idx],
    )

    valid_dataset = DatasetRetriever(
        train_arrays=z[val_idx],  # .index.values,
        targets=y[val_idx],
    )

valid_loader = torch.utils.data.DataLoader(
    valid_dataset,
    batch_size=128,
    sampler=SequentialSampler(train_dataset),
    # sampler=RandomSampler(train_dataset),
    pin_memory=False,
    drop_last=False,
    num_workers=1,
    # collate_fn=collate_fn,
)

Below is the output of the print statement from the dataset's __getitem__:

0 (307, 10)
...
...
301 (307, 10)
302 (307, 10)
303 (307, 10)
304 (307, 10)
305 (307, 10)
306 (307, 10)
307 (307, 10)
384 (307, 10)
512 (307, 10)

I don't know how the 512 and 384 indices are coming up here.

I'm guessing that train_dataset should be valid_dataset, since that sampler is for the validation set. The out-of-range index is likely one that is only valid for the training dataset, which is larger, so SequentialSampler(train_dataset) keeps producing indices past the end of the 307-element validation set.
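
A minimal sketch of the corrected loader, assuming the same valid_dataset and batch settings from your snippet; the only change is that the sampler is built over the validation dataset instead of the training one:

from torch.utils.data import DataLoader, SequentialSampler

# Sample indices from the validation dataset itself, not the larger training dataset.
valid_loader = DataLoader(
    valid_dataset,
    batch_size=128,
    sampler=SequentialSampler(valid_dataset),  # was SequentialSampler(train_dataset)
    pin_memory=False,
    drop_last=False,
    num_workers=1,
)

Note that a sequential sampler over the dataset you pass to DataLoader is already the default when shuffle=False, so you could also just drop the sampler argument entirely for the validation loader.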