FastaiLRFinder does not allow running more than 1 epoch

I have been trying to use FastaiLRFinder to find the best learning rate for my model.

If we create the trainer with the function create_supervised_trainer as below:

trainer = create_supervised_trainer(
    model, optimizer, criterion, device, output_transform=custom_output_transform
)
and run it:

with lr_finder.attach(
    trainer, to_save={"model": model, "optimizer": optimizer}, step_mode="exp"
) as lr_finder_training:
    lr_finder_training.run(dataloader)

A warning comes up saying: "UserWarning: Desired num_iter 50 is unreachable with the current run setup of 15 iteration (1 epochs)".

My dataloader has 15 batches per epoch, so it seems FastaiLRFinder does not let you run more than 1 epoch. Why?
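For context, the mismatch in the warning is just arithmetic: with 15 batches per epoch and a single-epoch run, only 15 of the desired 50 iterations are reachable. A plain-Python sketch, using the numbers from the warning above (`required_epochs` is a hypothetical helper, not part of ignite):

```python
import math

def required_epochs(num_iter: int, batches_per_epoch: int) -> int:
    """Smallest number of epochs that makes num_iter iterations reachable."""
    return math.ceil(num_iter / batches_per_epoch)

# Values from the warning: 50 desired iterations, 15 batches per epoch.
epochs = required_epochs(50, 15)
print(epochs)  # one epoch gives only 15 iterations, so more epochs are needed
```

If the trainer returned by `attach` accepted a larger `max_epochs` in its `run()` call, the 50 iterations would be reachable; whether the LR finder allows that is exactly my question.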

Looking at the source code, we can see here that this check stops the user from requesting more iterations than fit in a single pass over their dataloader.

But why? Am I missing something important here?

Let's discuss it here.