Converting a PyTorch DataLoader to TrainDataLoaderIter

Hi

I defined the DataLoader using the following code.
Once train_loader is defined, I would like to convert it to the TrainDataLoaderIter type so that I can use the next function to iterate over the dataset:

train_iter = TrainDataLoaderIter(train_loader)
da = next(train_iter)

Here, da should contain 4 tensors; however, I only got 2 tensors. Why?
Is there any other way to iterate a DataLoader with the next function?
Or is there another method to correctly convert a DataLoader to a DataLoaderIter?

from torch.utils.data import DataLoader, IterableDataset

class FeatureBuffer(IterableDataset):
    def __init__(self):
        super().__init__()
        pass

    def resize_fv(self):
        # ... (body omitted) ...
        return feature_buff, y, y_dynamic, uLen

    def __len__(self):
        # len
        return len

    def __iter__(self):
        self.resize_fv()

train_dataset = FeatureBuffer()
test_dataset = FeatureBuffer()
train_loader = DataLoader(dataset=train_dataset, batch_size=None, shuffle=False)
test_loader = DataLoader(dataset=test_dataset, batch_size=None, shuffle=False)

I don’t completely understand your use case, but note that an IterableDataset should return an iterator from __iter__, as described in the docs.

Your current code is also using the __len__ method from a map-style Dataset, so I’m unsure what your exact requirements for this dataset are (see the map-style sketch at the end of this post).
Here is a small example using your code as a base:

import torch

class FeatureBuffer(torch.utils.data.IterableDataset):
    def __init__(self):
        super().__init__()

    def resize_fv(self):
        # each sample is a tuple of 4 tensors
        return torch.randn(1), torch.randn(1), torch.randn(1), torch.randn(1)

    def __iter__(self):
        # __iter__ has to return an iterator over the samples
        return iter([self.resize_fv() for _ in range(10)])

dataset = FeatureBuffer()
loader = torch.utils.data.DataLoader(dataset, batch_size=2)

for a, b, c, d in loader:
    print(a.shape, b.shape, c.shape, d.shape)

print(next(iter(loader)))
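
On a side note: if you actually want __len__ (and index-based shuffling and batching), a map-style Dataset might be the better fit. Here is a rough sketch of that alternative; the class name FeatureBufferMap, the sample count of 10, and the random tensors are just placeholders standing in for your real data:

import torch

class FeatureBufferMap(torch.utils.data.Dataset):
    def __init__(self, num_samples=10):
        super().__init__()
        self.num_samples = num_samples

    def __len__(self):
        # map-style datasets report their size here
        return self.num_samples

    def __getitem__(self, idx):
        # placeholder tensors standing in for feature_buff, y, y_dynamic, uLen
        return torch.randn(1), torch.randn(1), torch.randn(1), torch.randn(1)

dataset = FeatureBufferMap()
loader = torch.utils.data.DataLoader(dataset, batch_size=2, shuffle=True)

a, b, c, d = next(iter(loader))  # each of a, b, c, d has shape [2, 1]

As with the iterable example above, wrapping the loader in iter() and calling next() gives you a single batch, so you shouldn’t need a separate DataLoaderIter class just to use next.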