Train Model with Multiple Dataloaders

I have some train dataloaders saved in a directory. Since the dataset is too big (image data) to save in a single dataloader, I saved them separately as below:

path_dataloaders = [
    './trainloaders/train_dataloat1',
    './trainloaders/train_dataloat2',
    './trainloaders/train_dataloat3',
    ...
]
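(For reference, each file was created roughly like below; this is just an illustrative sketch, and image_chunks / label_chunks stand in for my real, already-loaded tensors.)

import torch
from torch.utils.data import TensorDataset, DataLoader

# Illustrative only: 'image_chunks' and 'label_chunks' are placeholders
# for the real tensors that are too big to keep in one file.
for i, (images, labels) in enumerate(zip(image_chunks, label_chunks), start=1):
    dataset = TensorDataset(images, labels)
    loader = DataLoader(dataset, batch_size=64, shuffle=True)
    torch.save(loader, f'./trainloaders/train_dataloat{i}')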

Here, I would like to train a model with these multiple dataloaders.

import glob
import torch
import torch.nn as nn
import torch.optim as optim
from tqdm import tqdm

# Get paths to the saved dataloaders
path_dataloaders = glob.glob('./trainloaders/*')

# Device (GPU if available)
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Model instance
model = Net()
model.to(device)
# Loss function
criterion = nn.MSELoss()
# Optimizer
optimizer = optim.SGD(model.parameters(), lr=0.01)


# Train
def train(model):
    model.train()
    running_loss = 0.0
    for data, label in train_loader:
        data, label = data.to(device), label.to(device)
        optimizer.zero_grad()
        output = model(data)
        loss = criterion(output, label)
        loss.backward()
        optimizer.step()
        running_loss += loss.item()
    # Return the mean loss over this dataloader for monitoring
    return running_loss / len(train_loader)


# Training!!
max_epoch = 100

for path in tqdm(path_dataloaders):
    train_loader = torch.load(path)
    for epoch in range(max_epoch):
        epoch_loss = train(model)
        print(f'{path} | epoch {epoch}: loss {epoch_loss:.4f}')

As a result, the training process seems to work, but the training loss is really unstable (sometimes going down, sometimes going up).

Does anyone know if this approach is correct, or can you advise me on a better way?
Thank you in advance.

I'm not sure how you are saving the DataLoaders. Are you storing the Datasets including all loaded samples?
If so, the approach should be working fine. Alternatively, you could of course also lazily load the data and use a single Dataset.
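For the lazy approach, a minimal sketch could look like this (the image paths, labels, and transform are placeholders standing in for your actual data):

import torch
from torch.utils.data import Dataset, DataLoader
from PIL import Image

class LazyImageDataset(Dataset):
    # Minimal sketch of a lazily loading Dataset: only file paths and
    # labels are kept in memory; each image is read in __getitem__.
    def __init__(self, image_paths, labels, transform=None):
        self.image_paths = image_paths
        self.labels = labels
        self.transform = transform

    def __len__(self):
        return len(self.image_paths)

    def __getitem__(self, index):
        image = Image.open(self.image_paths[index]).convert('RGB')
        if self.transform is not None:
            image = self.transform(image)
        return image, self.labels[index]

# Usage: a single DataLoader over all samples instead of several saved loaders.
dataset = LazyImageDataset(image_paths, labels, transform=transform)
train_loader = DataLoader(dataset, batch_size=64, shuffle=True, num_workers=2)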


Thank you for your comment.
My Datasets include all the data, so it should be working fine.
The unstable train loss might be due to some other reason. Thanks anyway.