Reset parameters during k-fold cross-validation

I would like to know whether it is necessary to reset the learning rate scheduler, the optimizer state, and the CUDA cache at the start of each fold during k-fold cross-validation.

Currently, I am using this code:

    for fold, (train_idx, test_idx) in enumerate(splits.split(np.arange(len(dataset)))):
        # ...
        # Reset model weights and re-create the optimizer and scheduler for this fold
        net.apply(reset_wgts)
        optimizer = OPTIMIZER
        scheduler = SCHEDULER
        # ...
        for epoch in range(NUM_EPOCHS):
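
For context, here is a fuller sketch of what I mean, assuming sklearn's KFold for the splits; Adam and StepLR are just placeholder choices standing in for my actual optimizer and scheduler, and dataset, net, and NUM_EPOCHS are defined elsewhere:

    import numpy as np
    import torch
    from sklearn.model_selection import KFold

    def reset_wgts(m):
        # Re-initialize any submodule that exposes reset_parameters (Linear, Conv, ...)
        if hasattr(m, "reset_parameters"):
            m.reset_parameters()

    splits = KFold(n_splits=5, shuffle=True, random_state=0)

    for fold, (train_idx, test_idx) in enumerate(splits.split(np.arange(len(dataset)))):
        # Fresh weights so each fold starts from scratch
        net.apply(reset_wgts)
        # Fresh optimizer and scheduler so momentum/LR state does not carry over between folds
        optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)  # placeholder optimizer
        scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=10, gamma=0.1)  # placeholder scheduler
        # Is this call also needed on every fold?
        torch.cuda.empty_cache()
        for epoch in range(NUM_EPOCHS):
            ...  # usual train/validation loop for this fold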