I am loading from several Dataloaders at once, which means I can’t do

```
for batches, labels in dataloader:
```

I really need something like

```
batches, labels = dataloader.next()
```

Can anyone provide a solution?

Thanks

Try this:

```
batch = next(iter(dataloader))
inputs, targets = batch  # unpack a single batch
```

This way you only fetch a single batch of data, which is more efficient than a full `for` loop if you just want to inspect one batch.
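One caveat worth noting: `next(iter(dataloader))` builds a fresh iterator on every call, so repeated calls always return the *first* batch. To step through a loader manually, create the iterator once and call `next()` on it, restarting on `StopIteration` when an epoch ends. A minimal sketch using a plain list as a stand-in for a `DataLoader` (any iterable behaves the same way; the names are illustrative):

```python
# Stand-in for a DataLoader; each element is one (batch, label) pair.
dl1 = [("batch0", 0), ("batch1", 1)]

it1 = iter(dl1)  # create the iterator ONCE, outside the step loop
seen = []
for step in range(4):
    try:
        batch, label = next(it1)
    except StopIteration:
        # Loader exhausted: start a new epoch and fetch again.
        it1 = iter(dl1)
        batch, label = next(it1)
    seen.append(batch)

print(seen)  # wraps around once the loader is exhausted
```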

In case both datasets are of the same size, you might also zip the loaders and iterate over them with a `for` loop, like so:

```
from torch.utils.data import DataLoader

ds1 = [0, 1, 2, 3]
ds2 = [10, 11, 12, 13]
dl1 = DataLoader(ds1)
dl2 = DataLoader(ds2)

# zip advances both loaders in lockstep, one batch from each per step
for s1, s2 in zip(dl1, dl2):
    print(s1)
    print(s2)
```

This yields:

```
tensor([0])
tensor([10])
tensor([1])
tensor([11])
tensor([2])
tensor([12])
tensor([3])
tensor([13])
```
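One caveat with `zip`: if the loaders differ in length, iteration stops at the shorter one. A common workaround (a sketch, not from this thread) is to wrap the shorter loader in `itertools.cycle` so it repeats; note that `cycle` caches the items it sees, so batches from the first pass are replayed without reshuffling. Using plain lists as stand-ins for the loaders:

```python
from itertools import cycle

# Stand-ins for two DataLoaders of different lengths.
dl_long = [0, 1, 2, 3]
dl_short = [10, 11]

# cycle() replays the shorter loader so every long batch gets a partner;
# zip() now stops when the longer loader is exhausted.
pairs = list(zip(dl_long, cycle(dl_short)))
print(pairs)  # [(0, 10), (1, 11), (2, 10), (3, 11)]
```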