While or map or ... instead of for

Please give me an alternative way to speed up the following piece of code:

    for i, data in enumerate(trainloader, 0):
        inputs, labels = data[0].to(device), data[1].to(device)
        optimizer.zero_grad()
        outputs = net(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()

        if i == 10:
            break

To speed up the data loading, you could preload the complete dataset, assuming you won't apply any on-the-fly data augmentation. Beyond that, the DataLoader can already use multiprocessing (via its `num_workers` argument) to load and process batches in the background while the model is training, so I don't see how a `while` loop or a `map` would help; the better approach is to remove potential bottlenecks in the data loading itself.
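A minimal sketch of the preloading idea, assuming the whole dataset fits in device memory (the tensor shapes and names here are illustrative, not from your code): move the data to the device once, then iterate over slices of the resident tensors instead of paying a host-to-device copy per batch.

```python
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Preload once: after this, each iteration only slices tensors
# that are already resident on the device.
inputs = torch.randn(64, 3, 8, 8).to(device)   # stand-in for your dataset
labels = torch.randint(0, 10, (64,)).to(device)

batch_size = 16
num_batches = inputs.size(0) // batch_size

for i in range(num_batches):
    batch_x = inputs[i * batch_size:(i + 1) * batch_size]
    batch_y = labels[i * batch_size:(i + 1) * batch_size]
    # ... forward / backward / optimizer.step() as in your loop ...
```

If the dataset does not fit in memory, keep the DataLoader but tune it instead, e.g. `DataLoader(dataset, batch_size=16, num_workers=4, pin_memory=True)`, so workers prefetch batches while the GPU is busy.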