Cross-Validation and Epochs

I am trying to understand the relationship between cross-validation and the number of epochs.

Let’s say I am using 10-fold cross-validation, and I set the number of epochs to 100.

Does that mean my model will run through the whole dataset 1000 times? Does the actual number of epochs then become 1000?

In the basic approach, called k-fold CV, the training set is split into k smaller sets (other approaches exist, but they generally follow the same principles). For each of the k “folds”, the following procedure is applied (a code sketch follows the list):

  • A model is trained using k−1 of the folds as training data;
  • the resulting model is validated on the remaining part of the data (i.e., it is used as a test set to compute a performance measure such as accuracy).
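
For concreteness, here is a minimal sketch of that procedure using scikit-learn’s KFold. The make_classification dataset and LogisticRegression model are just placeholders standing in for whatever data and model you actually have:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=500, random_state=0)   # placeholder dataset

scores = []
for train_idx, val_idx in KFold(n_splits=10, shuffle=True, random_state=0).split(X):
    model = LogisticRegression(max_iter=1000)    # a fresh model for every fold
    model.fit(X[train_idx], y[train_idx])        # train on the k-1 training folds
    scores.append(model.score(X[val_idx], y[val_idx]))  # validate on the held-out fold

print(f"mean CV accuracy: {np.mean(scores):.3f}")
```

The key point is that the model is re-created inside the loop, so each fold gets its own independent training run.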

So my answer is that, within each fold, the model sees the 10 − 1 = 9 training folds, reshuffled at each of the 100 epochs, and is validated on the remaining fold once per epoch (so 100 times) over those hundred epochs.

Each fold trains a fresh model from scratch, so across all 10 folds the data is passed over 10 × 100 = 1000 times in total, but no single model ever trains for more than 100 epochs.
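
To make the counting explicit, here is a sketch where each call to partial_fit stands in for one epoch (again with a placeholder dataset, and SGDClassifier chosen only because it supports incremental fitting):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import KFold

X, y = make_classification(n_samples=500, random_state=0)   # placeholder dataset
n_epochs, total_passes = 100, 0

for fold, (train_idx, val_idx) in enumerate(
        KFold(n_splits=10, shuffle=True, random_state=0).split(X)):
    model = SGDClassifier(random_state=0)        # a fresh model for every fold
    for epoch in range(n_epochs):                # 100 epochs for THIS model only
        model.partial_fit(X[train_idx], y[train_idx],
                          classes=np.unique(y))  # one partial_fit call = one pass
        total_passes += 1
    print(f"fold {fold}: val accuracy = {model.score(X[val_idx], y[val_idx]):.3f}")

print(f"total passes over the data: {total_passes}")   # 10 folds * 100 = 1000
```

So the “1000” in the original question counts passes spread across ten independent models, not 1000 epochs for any one model.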

I am sure the dataset will be run through 10 times for each epoch, because of the 10 folds.

Then that contradicts the actual meaning of an epoch.
An epoch in machine learning means one complete pass of the training dataset through the algorithm.
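
To illustrate that definition, here is a tiny self-contained loop (toy data and a hand-rolled logistic-regression update, purely for illustration): the outer loop counts epochs, and each iteration of it makes exactly one complete pass over the training set.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))                   # toy training set
y = (X @ rng.normal(size=5) > 0).astype(float)   # toy labels

w = np.zeros(5)                                  # logistic-regression weights
num_epochs, batch_size, lr = 100, 32, 0.1

for epoch in range(num_epochs):                  # 100 epochs = 100 complete passes
    order = rng.permutation(len(X))              # reshuffle once per epoch
    for start in range(0, len(X), batch_size):   # mini-batches covering the set once
        idx = order[start:start + batch_size]
        p = 1.0 / (1.0 + np.exp(-X[idx] @ w))    # predicted probabilities
        w -= lr * X[idx].T @ (p - y[idx]) / len(idx)  # one SGD step

# After the loop, this model has seen the training set exactly num_epochs times.
```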

True, that’s why I am feeling confused. :slight_smile: