ValueError: n_splits=5 cannot be greater than the number of members in each class

I am using Skorch for the first time for hyperparameter tuning of a neural network made with PyTorch, and I am having issues getting it to run. I have used this as a guide: Hyperparameter Search with PyTorch and Skorch

However, I have used my own dataset, consisting of 7 classes and 10292 images in total, with the smallest class having 480 images. All the help I have found online mentions

StratifiedKFold

However, I have not used this anywhere in my code. The Skorch part is below:

    net = NeuralNetClassifier(
        module=CNN,
        criterion=nn.CrossEntropyLoss,
        optimizer=torch.optim.Adam,
        max_epochs=EPOCHS,
        lr=LR,
        verbose=False,
    )

    param_grid = {
        'module__dropout_rate': [0.0, 0.16, 0.21, 0.28, 0.12, 0.04, 0.09],
    }
    grid = GridSearchCV(net, param_grid, refit=False, n_jobs=-1, cv=2)

    search_batches = 1
    counter = 0
    for i, data in enumerate(trainDataLoader):
        counter += 1
        image, labels = data
        image = image.to(device)
        labels = labels.to(device)
        outputs = grid.fit(image, labels)
        if counter == search_batches:
            break

Any help on how to solve this issue and get some solid hyperparameter tuning going would be much appreciated!

First time poster.

It seems you are trying to run the Skorch grid search inside the training loop, which is incorrect. Because `GridSearchCV` uses `StratifiedKFold` internally for classifiers, calling `grid.fit` on a single batch means some classes in that batch have fewer samples than the number of folds, which is exactly the error you are seeing. You should perform the grid search before training the model, and pass it the entire dataset rather than individual batches.
After running the grid search, you can use the best hyperparameters found to train your model.
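
A minimal sketch of what that could look like, reusing `trainDataLoader`, `CNN`, `device`, `EPOCHS` and `LR` from your question. The loop that stacks the loader's batches into single arrays is an assumption about how your data is stored, so adapt it to your dataset:

    import torch
    import torch.nn as nn
    from skorch import NeuralNetClassifier
    from sklearn.model_selection import GridSearchCV

    # Collect the whole dataset from the existing DataLoader instead of a single batch.
    # Keep the arrays on the CPU so scikit-learn can split them; skorch moves each
    # batch to the GPU during fitting via the `device` argument below.
    images, labels = [], []
    for batch_images, batch_labels in trainDataLoader:
        images.append(batch_images)
        labels.append(batch_labels)
    X = torch.cat(images).numpy()
    y = torch.cat(labels).numpy()

    net = NeuralNetClassifier(
        module=CNN,
        criterion=nn.CrossEntropyLoss,
        optimizer=torch.optim.Adam,
        max_epochs=EPOCHS,
        lr=LR,
        device=device,
        verbose=False,
    )

    param_grid = {
        'module__dropout_rate': [0.0, 0.16, 0.21, 0.28, 0.12, 0.04, 0.09],
    }

    # With all 10292 images, every class has far more members than the number of
    # folds, so the StratifiedKFold split no longer fails.
    # If n_jobs=-1 raises CUDA errors in the worker processes, drop it or use n_jobs=1.
    grid = GridSearchCV(net, param_grid, refit=False, cv=2, n_jobs=-1)
    grid.fit(X, y)

    print(grid.best_score_, grid.best_params_)

Once that prints the best dropout rate, plug it into your normal training loop, or set `refit=True` and use `grid.best_estimator_` directly.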